Jan 23 09:00:49 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 23 09:00:49 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 23 09:00:49 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 09:00:49 localhost kernel: BIOS-provided physical RAM map:
Jan 23 09:00:49 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 23 09:00:49 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 23 09:00:49 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 23 09:00:49 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 23 09:00:49 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 23 09:00:49 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 23 09:00:49 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 23 09:00:49 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 23 09:00:49 localhost kernel: NX (Execute Disable) protection: active
Jan 23 09:00:49 localhost kernel: APIC: Static calls initialized
Jan 23 09:00:49 localhost kernel: SMBIOS 2.8 present.
Jan 23 09:00:49 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 23 09:00:49 localhost kernel: Hypervisor detected: KVM
Jan 23 09:00:49 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 23 09:00:49 localhost kernel: kvm-clock: using sched offset of 3156574041 cycles
Jan 23 09:00:49 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 23 09:00:49 localhost kernel: tsc: Detected 2799.998 MHz processor
Jan 23 09:00:49 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 23 09:00:49 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 23 09:00:49 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 23 09:00:49 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 23 09:00:49 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 23 09:00:49 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 23 09:00:49 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 23 09:00:49 localhost kernel: Using GB pages for direct mapping
Jan 23 09:00:49 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 23 09:00:49 localhost kernel: ACPI: Early table checksum verification disabled
Jan 23 09:00:49 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 23 09:00:49 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:49 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:49 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:49 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 23 09:00:49 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:49 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:49 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 23 09:00:49 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 23 09:00:49 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 23 09:00:49 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 23 09:00:49 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 23 09:00:49 localhost kernel: No NUMA configuration found
Jan 23 09:00:49 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 23 09:00:49 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 23 09:00:49 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 23 09:00:49 localhost kernel: Zone ranges:
Jan 23 09:00:49 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 23 09:00:49 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 23 09:00:49 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 09:00:49 localhost kernel:   Device   empty
Jan 23 09:00:49 localhost kernel: Movable zone start for each node
Jan 23 09:00:49 localhost kernel: Early memory node ranges
Jan 23 09:00:49 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 23 09:00:49 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 23 09:00:49 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 09:00:49 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 23 09:00:49 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 23 09:00:49 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 23 09:00:49 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 23 09:00:49 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 23 09:00:49 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 23 09:00:49 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 23 09:00:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 23 09:00:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 23 09:00:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 23 09:00:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 23 09:00:49 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 23 09:00:49 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 23 09:00:49 localhost kernel: TSC deadline timer available
Jan 23 09:00:49 localhost kernel: CPU topo: Max. logical packages:   8
Jan 23 09:00:49 localhost kernel: CPU topo: Max. logical dies:       8
Jan 23 09:00:49 localhost kernel: CPU topo: Max. dies per package:   1
Jan 23 09:00:49 localhost kernel: CPU topo: Max. threads per core:   1
Jan 23 09:00:49 localhost kernel: CPU topo: Num. cores per package:     1
Jan 23 09:00:49 localhost kernel: CPU topo: Num. threads per package:   1
Jan 23 09:00:49 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 23 09:00:49 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 23 09:00:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 23 09:00:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 23 09:00:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 23 09:00:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 23 09:00:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 23 09:00:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 23 09:00:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 23 09:00:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 23 09:00:49 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 23 09:00:49 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 23 09:00:49 localhost kernel: Booting paravirtualized kernel on KVM
Jan 23 09:00:49 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 23 09:00:49 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 23 09:00:49 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 23 09:00:49 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 23 09:00:49 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 23 09:00:49 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 23 09:00:49 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 09:00:49 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 23 09:00:49 localhost kernel: random: crng init done
Jan 23 09:00:49 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 09:00:49 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 23 09:00:49 localhost kernel: Fallback order for Node 0: 0 
Jan 23 09:00:49 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 23 09:00:49 localhost kernel: Policy zone: Normal
Jan 23 09:00:49 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 09:00:49 localhost kernel: software IO TLB: area num 8.
Jan 23 09:00:49 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 23 09:00:49 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 23 09:00:49 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 23 09:00:49 localhost kernel: Dynamic Preempt: voluntary
Jan 23 09:00:49 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 09:00:49 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 23 09:00:49 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 23 09:00:49 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 23 09:00:49 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 23 09:00:49 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 23 09:00:49 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 09:00:49 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 23 09:00:49 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 09:00:49 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 09:00:49 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 09:00:49 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 23 09:00:49 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 09:00:49 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 23 09:00:49 localhost kernel: Console: colour VGA+ 80x25
Jan 23 09:00:49 localhost kernel: printk: console [ttyS0] enabled
Jan 23 09:00:49 localhost kernel: ACPI: Core revision 20230331
Jan 23 09:00:49 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 23 09:00:49 localhost kernel: x2apic enabled
Jan 23 09:00:49 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 23 09:00:49 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 23 09:00:49 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 23 09:00:49 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 23 09:00:49 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 23 09:00:49 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 23 09:00:49 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 23 09:00:49 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 23 09:00:49 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 23 09:00:49 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 23 09:00:49 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 23 09:00:49 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 23 09:00:49 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 23 09:00:49 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 23 09:00:49 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 23 09:00:49 localhost kernel: x86/bugs: return thunk changed
Jan 23 09:00:49 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 23 09:00:49 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 23 09:00:49 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 23 09:00:49 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 23 09:00:49 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 23 09:00:49 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 23 09:00:49 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 23 09:00:49 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 23 09:00:49 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 23 09:00:49 localhost kernel: landlock: Up and running.
Jan 23 09:00:49 localhost kernel: Yama: becoming mindful.
Jan 23 09:00:49 localhost kernel: SELinux:  Initializing.
Jan 23 09:00:49 localhost kernel: LSM support for eBPF active
Jan 23 09:00:49 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 09:00:49 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 09:00:49 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 23 09:00:49 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 23 09:00:49 localhost kernel: ... version:                0
Jan 23 09:00:49 localhost kernel: ... bit width:              48
Jan 23 09:00:49 localhost kernel: ... generic registers:      6
Jan 23 09:00:49 localhost kernel: ... value mask:             0000ffffffffffff
Jan 23 09:00:49 localhost kernel: ... max period:             00007fffffffffff
Jan 23 09:00:49 localhost kernel: ... fixed-purpose events:   0
Jan 23 09:00:49 localhost kernel: ... event mask:             000000000000003f
Jan 23 09:00:49 localhost kernel: signal: max sigframe size: 1776
Jan 23 09:00:49 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 23 09:00:49 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 23 09:00:49 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 23 09:00:49 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 23 09:00:49 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 23 09:00:49 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 23 09:00:49 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 23 09:00:49 localhost kernel: node 0 deferred pages initialised in 10ms
Jan 23 09:00:49 localhost kernel: Memory: 7763764K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 23 09:00:49 localhost kernel: devtmpfs: initialized
Jan 23 09:00:49 localhost kernel: x86/mm: Memory block size: 128MB
Jan 23 09:00:49 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 09:00:49 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 23 09:00:49 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 09:00:49 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 09:00:49 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 23 09:00:49 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 09:00:49 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 09:00:49 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 23 09:00:49 localhost kernel: audit: type=2000 audit(1769158847.233:1): state=initialized audit_enabled=0 res=1
Jan 23 09:00:49 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 23 09:00:49 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 09:00:49 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 23 09:00:49 localhost kernel: cpuidle: using governor menu
Jan 23 09:00:49 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 09:00:49 localhost kernel: PCI: Using configuration type 1 for base access
Jan 23 09:00:49 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 23 09:00:49 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 23 09:00:49 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 09:00:49 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 09:00:49 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 09:00:49 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 09:00:49 localhost kernel: Demotion targets for Node 0: null
Jan 23 09:00:49 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 23 09:00:49 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 23 09:00:49 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 23 09:00:49 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 09:00:49 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 09:00:49 localhost kernel: ACPI: Interpreter enabled
Jan 23 09:00:49 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 23 09:00:49 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 23 09:00:49 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 23 09:00:49 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 23 09:00:49 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 23 09:00:49 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 09:00:49 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [3] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [4] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [5] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [6] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [7] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [8] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [9] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [10] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [11] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [12] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [13] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [14] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [15] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [16] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [17] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [18] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [19] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [20] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [21] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [22] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [23] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [24] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [25] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [26] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [27] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [28] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [29] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [30] registered
Jan 23 09:00:49 localhost kernel: acpiphp: Slot [31] registered
Jan 23 09:00:49 localhost kernel: PCI host bridge to bus 0000:00
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 09:00:49 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 23 09:00:49 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 23 09:00:49 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 23 09:00:49 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 23 09:00:49 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 23 09:00:49 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 23 09:00:49 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 23 09:00:49 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 23 09:00:49 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 23 09:00:49 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 23 09:00:49 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 23 09:00:49 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 23 09:00:49 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 23 09:00:49 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 23 09:00:49 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 23 09:00:49 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 09:00:49 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 23 09:00:49 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 23 09:00:49 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 23 09:00:49 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 23 09:00:49 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 23 09:00:49 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 23 09:00:49 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 23 09:00:49 localhost kernel: iommu: Default domain type: Translated
Jan 23 09:00:49 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 23 09:00:49 localhost kernel: SCSI subsystem initialized
Jan 23 09:00:49 localhost kernel: ACPI: bus type USB registered
Jan 23 09:00:49 localhost kernel: usbcore: registered new interface driver usbfs
Jan 23 09:00:49 localhost kernel: usbcore: registered new interface driver hub
Jan 23 09:00:49 localhost kernel: usbcore: registered new device driver usb
Jan 23 09:00:49 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 23 09:00:49 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 23 09:00:49 localhost kernel: PTP clock support registered
Jan 23 09:00:49 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 23 09:00:49 localhost kernel: NetLabel: Initializing
Jan 23 09:00:49 localhost kernel: NetLabel:  domain hash size = 128
Jan 23 09:00:49 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 23 09:00:49 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 23 09:00:49 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 23 09:00:49 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 23 09:00:49 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 23 09:00:49 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 23 09:00:49 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 23 09:00:49 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 23 09:00:49 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 23 09:00:49 localhost kernel: vgaarb: loaded
Jan 23 09:00:49 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 23 09:00:49 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 23 09:00:49 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 23 09:00:49 localhost kernel: pnp: PnP ACPI init
Jan 23 09:00:49 localhost kernel: pnp 00:03: [dma 2]
Jan 23 09:00:49 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 23 09:00:49 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 23 09:00:49 localhost kernel: NET: Registered PF_INET protocol family
Jan 23 09:00:49 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 23 09:00:49 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 23 09:00:49 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 23 09:00:49 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 23 09:00:49 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 23 09:00:49 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 23 09:00:49 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 23 09:00:49 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 09:00:49 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 09:00:49 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 23 09:00:49 localhost kernel: NET: Registered PF_XDP protocol family
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 23 09:00:49 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 23 09:00:49 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 23 09:00:49 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 23 09:00:49 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 87300 usecs
Jan 23 09:00:49 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 23 09:00:49 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 23 09:00:49 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 23 09:00:49 localhost kernel: ACPI: bus type thunderbolt registered
Jan 23 09:00:49 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 23 09:00:49 localhost kernel: Initialise system trusted keyrings
Jan 23 09:00:49 localhost kernel: Key type blacklist registered
Jan 23 09:00:49 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 23 09:00:49 localhost kernel: zbud: loaded
Jan 23 09:00:49 localhost kernel: integrity: Platform Keyring initialized
Jan 23 09:00:49 localhost kernel: integrity: Machine keyring initialized
Jan 23 09:00:49 localhost kernel: Freeing initrd memory: 87956K
Jan 23 09:00:49 localhost kernel: NET: Registered PF_ALG protocol family
Jan 23 09:00:49 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 23 09:00:49 localhost kernel: Key type asymmetric registered
Jan 23 09:00:49 localhost kernel: Asymmetric key parser 'x509' registered
Jan 23 09:00:49 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 23 09:00:49 localhost kernel: io scheduler mq-deadline registered
Jan 23 09:00:49 localhost kernel: io scheduler kyber registered
Jan 23 09:00:49 localhost kernel: io scheduler bfq registered
Jan 23 09:00:49 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 23 09:00:49 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 23 09:00:49 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 23 09:00:49 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 23 09:00:49 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 23 09:00:49 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 23 09:00:49 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 23 09:00:49 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 23 09:00:49 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 23 09:00:49 localhost kernel: Non-volatile memory driver v1.3
Jan 23 09:00:49 localhost kernel: rdac: device handler registered
Jan 23 09:00:49 localhost kernel: hp_sw: device handler registered
Jan 23 09:00:49 localhost kernel: emc: device handler registered
Jan 23 09:00:49 localhost kernel: alua: device handler registered
Jan 23 09:00:49 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 23 09:00:49 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 23 09:00:49 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 23 09:00:49 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 23 09:00:49 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 23 09:00:49 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 23 09:00:49 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 23 09:00:49 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 23 09:00:49 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 23 09:00:49 localhost kernel: hub 1-0:1.0: USB hub found
Jan 23 09:00:49 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 23 09:00:49 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 23 09:00:49 localhost kernel: usbserial: USB Serial support registered for generic
Jan 23 09:00:49 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 23 09:00:49 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 23 09:00:49 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 23 09:00:49 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 23 09:00:49 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 23 09:00:49 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 23 09:00:49 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 23 09:00:49 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-23T09:00:48 UTC (1769158848)
Jan 23 09:00:49 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 23 09:00:49 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 23 09:00:49 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 23 09:00:49 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 23 09:00:49 localhost kernel: usbcore: registered new interface driver usbhid
Jan 23 09:00:49 localhost kernel: usbhid: USB HID core driver
Jan 23 09:00:49 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 23 09:00:49 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 23 09:00:49 localhost kernel: Initializing XFRM netlink socket
Jan 23 09:00:49 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 23 09:00:49 localhost kernel: Segment Routing with IPv6
Jan 23 09:00:49 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 23 09:00:49 localhost kernel: mpls_gso: MPLS GSO support
Jan 23 09:00:49 localhost kernel: IPI shorthand broadcast: enabled
Jan 23 09:00:49 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 23 09:00:49 localhost kernel: AES CTR mode by8 optimization enabled
Jan 23 09:00:49 localhost kernel: sched_clock: Marking stable (1273006764, 157430312)->(1579641599, -149204523)
Jan 23 09:00:49 localhost kernel: registered taskstats version 1
Jan 23 09:00:49 localhost kernel: Loading compiled-in X.509 certificates
Jan 23 09:00:49 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 09:00:49 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 23 09:00:49 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 23 09:00:49 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 23 09:00:49 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 23 09:00:49 localhost kernel: Demotion targets for Node 0: null
Jan 23 09:00:49 localhost kernel: page_owner is disabled
Jan 23 09:00:49 localhost kernel: Key type .fscrypt registered
Jan 23 09:00:49 localhost kernel: Key type fscrypt-provisioning registered
Jan 23 09:00:49 localhost kernel: Key type big_key registered
Jan 23 09:00:49 localhost kernel: Key type encrypted registered
Jan 23 09:00:49 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 23 09:00:49 localhost kernel: Loading compiled-in module X.509 certificates
Jan 23 09:00:49 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 09:00:49 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 23 09:00:49 localhost kernel: ima: No architecture policies found
Jan 23 09:00:49 localhost kernel: evm: Initialising EVM extended attributes:
Jan 23 09:00:49 localhost kernel: evm: security.selinux
Jan 23 09:00:49 localhost kernel: evm: security.SMACK64 (disabled)
Jan 23 09:00:49 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 23 09:00:49 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 23 09:00:49 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 23 09:00:49 localhost kernel: evm: security.apparmor (disabled)
Jan 23 09:00:49 localhost kernel: evm: security.ima
Jan 23 09:00:49 localhost kernel: evm: security.capability
Jan 23 09:00:49 localhost kernel: evm: HMAC attrs: 0x1
Jan 23 09:00:49 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 23 09:00:49 localhost kernel: Running certificate verification RSA selftest
Jan 23 09:00:49 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 23 09:00:49 localhost kernel: Running certificate verification ECDSA selftest
Jan 23 09:00:49 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 23 09:00:49 localhost kernel: clk: Disabling unused clocks
Jan 23 09:00:49 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 23 09:00:49 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 23 09:00:49 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 23 09:00:49 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 23 09:00:49 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 23 09:00:49 localhost kernel: Run /init as init process
Jan 23 09:00:49 localhost kernel:   with arguments:
Jan 23 09:00:49 localhost kernel:     /init
Jan 23 09:00:49 localhost kernel:   with environment:
Jan 23 09:00:49 localhost kernel:     HOME=/
Jan 23 09:00:49 localhost kernel:     TERM=linux
Jan 23 09:00:49 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 23 09:00:49 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 09:00:49 localhost systemd[1]: Detected virtualization kvm.
Jan 23 09:00:49 localhost systemd[1]: Detected architecture x86-64.
Jan 23 09:00:49 localhost systemd[1]: Running in initrd.
Jan 23 09:00:49 localhost systemd[1]: No hostname configured, using default hostname.
Jan 23 09:00:49 localhost systemd[1]: Hostname set to <localhost>.
Jan 23 09:00:49 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 23 09:00:49 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 23 09:00:49 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 23 09:00:49 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 23 09:00:49 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 23 09:00:49 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 23 09:00:49 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 23 09:00:49 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 23 09:00:49 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 23 09:00:49 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 09:00:49 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 23 09:00:49 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 23 09:00:49 localhost systemd[1]: Reached target Local File Systems.
Jan 23 09:00:49 localhost systemd[1]: Reached target Path Units.
Jan 23 09:00:49 localhost systemd[1]: Reached target Slice Units.
Jan 23 09:00:49 localhost systemd[1]: Reached target Swaps.
Jan 23 09:00:49 localhost systemd[1]: Reached target Timer Units.
Jan 23 09:00:49 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 09:00:49 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 23 09:00:49 localhost systemd[1]: Listening on Journal Socket.
Jan 23 09:00:49 localhost systemd[1]: Listening on udev Control Socket.
Jan 23 09:00:49 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 23 09:00:49 localhost systemd[1]: Reached target Socket Units.
Jan 23 09:00:49 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 23 09:00:49 localhost systemd[1]: Starting Journal Service...
Jan 23 09:00:49 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 09:00:49 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 23 09:00:49 localhost systemd[1]: Starting Create System Users...
Jan 23 09:00:49 localhost systemd[1]: Starting Setup Virtual Console...
Jan 23 09:00:49 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 23 09:00:49 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 23 09:00:49 localhost systemd[1]: Finished Create System Users.
Jan 23 09:00:49 localhost systemd-journald[306]: Journal started
Jan 23 09:00:49 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/84c28ede41124d768f99c7405a7d029c) is 8.0M, max 153.6M, 145.6M free.
Jan 23 09:00:49 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Jan 23 09:00:49 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Jan 23 09:00:49 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 23 09:00:49 localhost systemd[1]: Started Journal Service.
Jan 23 09:00:49 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 09:00:49 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 09:00:49 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 09:00:49 localhost systemd[1]: Finished Setup Virtual Console.
Jan 23 09:00:49 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 23 09:00:49 localhost systemd[1]: Starting dracut cmdline hook...
Jan 23 09:00:49 localhost dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Jan 23 09:00:49 localhost dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 09:00:49 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 09:00:49 localhost systemd[1]: Finished dracut cmdline hook.
Jan 23 09:00:49 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 23 09:00:49 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 23 09:00:49 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 23 09:00:49 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 23 09:00:49 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 23 09:00:49 localhost kernel: RPC: Registered udp transport module.
Jan 23 09:00:49 localhost kernel: RPC: Registered tcp transport module.
Jan 23 09:00:49 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 23 09:00:49 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 23 09:00:49 localhost rpc.statd[442]: Version 2.5.4 starting
Jan 23 09:00:49 localhost rpc.statd[442]: Initializing NSM state
Jan 23 09:00:49 localhost rpc.idmapd[447]: Setting log level to 0
Jan 23 09:00:49 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 23 09:00:49 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 09:00:49 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 09:00:49 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 09:00:49 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 23 09:00:49 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 23 09:00:49 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 23 09:00:49 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 23 09:00:49 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 23 09:00:49 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 23 09:00:49 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 09:00:49 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 23 09:00:49 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 09:00:49 localhost systemd[1]: Reached target Network.
Jan 23 09:00:49 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 09:00:49 localhost systemd[1]: Starting dracut initqueue hook...
Jan 23 09:00:49 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 23 09:00:49 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 23 09:00:49 localhost kernel:  vda: vda1
Jan 23 09:00:49 localhost kernel: libata version 3.00 loaded.
Jan 23 09:00:49 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 23 09:00:49 localhost systemd-udevd[482]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:00:49 localhost kernel: scsi host0: ata_piix
Jan 23 09:00:49 localhost kernel: scsi host1: ata_piix
Jan 23 09:00:49 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 23 09:00:49 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 23 09:00:50 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 09:00:50 localhost systemd[1]: Reached target Initrd Root Device.
Jan 23 09:00:50 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 23 09:00:50 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 23 09:00:50 localhost kernel: ata1: found unknown device (class 0)
Jan 23 09:00:50 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 23 09:00:50 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 23 09:00:50 localhost systemd[1]: Reached target System Initialization.
Jan 23 09:00:50 localhost systemd[1]: Reached target Basic System.
Jan 23 09:00:50 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 23 09:00:50 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 23 09:00:50 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 23 09:00:50 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 23 09:00:50 localhost systemd[1]: Finished dracut initqueue hook.
Jan 23 09:00:50 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 09:00:50 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 23 09:00:50 localhost systemd[1]: Reached target Remote File Systems.
Jan 23 09:00:50 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 23 09:00:50 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 23 09:00:50 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 23 09:00:50 localhost systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Jan 23 09:00:50 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 09:00:50 localhost systemd[1]: Mounting /sysroot...
Jan 23 09:00:51 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 23 09:00:51 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 23 09:00:51 localhost kernel: XFS (vda1): Ending clean mount
Jan 23 09:00:51 localhost systemd[1]: Mounted /sysroot.
Jan 23 09:00:51 localhost systemd[1]: Reached target Initrd Root File System.
Jan 23 09:00:51 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 23 09:00:51 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 23 09:00:51 localhost systemd[1]: Reached target Initrd File Systems.
Jan 23 09:00:51 localhost systemd[1]: Reached target Initrd Default Target.
Jan 23 09:00:51 localhost systemd[1]: Starting dracut mount hook...
Jan 23 09:00:51 localhost systemd[1]: Finished dracut mount hook.
Jan 23 09:00:51 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 23 09:00:51 localhost rpc.idmapd[447]: exiting on signal 15
Jan 23 09:00:51 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 23 09:00:51 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 23 09:00:51 localhost systemd[1]: Stopped target Network.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Timer Units.
Jan 23 09:00:51 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 23 09:00:51 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Basic System.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Path Units.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Remote File Systems.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Slice Units.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Socket Units.
Jan 23 09:00:51 localhost systemd[1]: Stopped target System Initialization.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Local File Systems.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Swaps.
Jan 23 09:00:51 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped dracut mount hook.
Jan 23 09:00:51 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 23 09:00:51 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 23 09:00:51 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 23 09:00:51 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 23 09:00:51 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 23 09:00:51 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 23 09:00:51 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 23 09:00:51 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 23 09:00:51 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 23 09:00:51 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 23 09:00:51 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Closed udev Control Socket.
Jan 23 09:00:51 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Closed udev Kernel Socket.
Jan 23 09:00:51 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 23 09:00:51 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 23 09:00:51 localhost systemd[1]: Starting Cleanup udev Database...
Jan 23 09:00:51 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 23 09:00:51 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 23 09:00:51 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Create System Users.
Jan 23 09:00:51 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 23 09:00:51 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Finished Cleanup udev Database.
Jan 23 09:00:51 localhost systemd[1]: Reached target Switch Root.
Jan 23 09:00:51 localhost systemd[1]: Starting Switch Root...
Jan 23 09:00:51 localhost systemd[1]: Switching root.
Jan 23 09:00:51 localhost systemd-journald[306]: Journal stopped
Jan 23 09:00:52 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Jan 23 09:00:52 localhost kernel: audit: type=1404 audit(1769158851.471:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 23 09:00:52 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:00:52 localhost kernel: SELinux:  policy capability open_perms=1
Jan 23 09:00:52 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:00:52 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:00:52 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:00:52 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:00:52 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:00:52 localhost kernel: audit: type=1403 audit(1769158851.596:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 23 09:00:52 localhost systemd[1]: Successfully loaded SELinux policy in 127.604ms.
Jan 23 09:00:52 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.947ms.
Jan 23 09:00:52 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 09:00:52 localhost systemd[1]: Detected virtualization kvm.
Jan 23 09:00:52 localhost systemd[1]: Detected architecture x86-64.
Jan 23 09:00:52 localhost systemd-rc-local-generator[638]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:00:52 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 23 09:00:52 localhost systemd[1]: Stopped Switch Root.
Jan 23 09:00:52 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 23 09:00:52 localhost systemd[1]: Created slice Slice /system/getty.
Jan 23 09:00:52 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 23 09:00:52 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 23 09:00:52 localhost systemd[1]: Created slice User and Session Slice.
Jan 23 09:00:52 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 09:00:52 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 23 09:00:52 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 23 09:00:52 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 23 09:00:52 localhost systemd[1]: Stopped target Switch Root.
Jan 23 09:00:52 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 23 09:00:52 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 23 09:00:52 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 23 09:00:52 localhost systemd[1]: Reached target Path Units.
Jan 23 09:00:52 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 23 09:00:52 localhost systemd[1]: Reached target Slice Units.
Jan 23 09:00:52 localhost systemd[1]: Reached target Swaps.
Jan 23 09:00:52 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 23 09:00:52 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 23 09:00:52 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 23 09:00:52 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 23 09:00:52 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 23 09:00:52 localhost systemd[1]: Listening on udev Control Socket.
Jan 23 09:00:52 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 23 09:00:52 localhost systemd[1]: Mounting Huge Pages File System...
Jan 23 09:00:52 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 23 09:00:52 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 23 09:00:52 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 23 09:00:52 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 09:00:52 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 23 09:00:52 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 23 09:00:52 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 23 09:00:52 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 23 09:00:52 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 23 09:00:52 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 23 09:00:52 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 23 09:00:52 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 23 09:00:52 localhost systemd[1]: Stopped Journal Service.
Jan 23 09:00:52 localhost kernel: fuse: init (API version 7.37)
Jan 23 09:00:52 localhost systemd[1]: Starting Journal Service...
Jan 23 09:00:52 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 09:00:52 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 23 09:00:52 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 09:00:52 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 23 09:00:52 localhost systemd-journald[679]: Journal started
Jan 23 09:00:52 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 09:00:51 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 23 09:00:51 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 23 09:00:52 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 09:00:52 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 23 09:00:52 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 23 09:00:52 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 23 09:00:52 localhost systemd[1]: Started Journal Service.
Jan 23 09:00:52 localhost systemd[1]: Mounted Huge Pages File System.
Jan 23 09:00:52 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 23 09:00:52 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 23 09:00:52 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 23 09:00:52 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 23 09:00:52 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 09:00:52 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 23 09:00:52 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 09:00:52 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 23 09:00:52 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 23 09:00:52 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 23 09:00:52 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 23 09:00:52 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 23 09:00:52 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 23 09:00:52 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 23 09:00:52 localhost systemd[1]: Mounting FUSE Control File System...
Jan 23 09:00:52 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 09:00:52 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 23 09:00:52 localhost kernel: ACPI: bus type drm_connector registered
Jan 23 09:00:52 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 23 09:00:52 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 09:00:52 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 23 09:00:52 localhost systemd[1]: Starting Create System Users...
Jan 23 09:00:52 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 09:00:52 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 23 09:00:52 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 09:00:52 localhost systemd-journald[679]: Received client request to flush runtime journal.
Jan 23 09:00:52 localhost systemd[1]: Mounted FUSE Control File System.
Jan 23 09:00:52 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 23 09:00:52 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 23 09:00:52 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 09:00:52 localhost systemd[1]: Finished Create System Users.
Jan 23 09:00:52 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 09:00:52 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 23 09:00:52 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 09:00:52 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 23 09:00:52 localhost systemd[1]: Reached target Local File Systems.
Jan 23 09:00:52 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 23 09:00:52 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 23 09:00:52 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 09:00:52 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 23 09:00:52 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 23 09:00:52 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 23 09:00:52 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 09:00:52 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Jan 23 09:00:52 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 23 09:00:52 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 23 09:00:52 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 09:00:52 localhost systemd[1]: Starting Security Auditing Service...
Jan 23 09:00:52 localhost systemd[1]: Starting RPC Bind...
Jan 23 09:00:52 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 23 09:00:52 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 23 09:00:52 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 23 09:00:52 localhost systemd[1]: Started RPC Bind.
Jan 23 09:00:52 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 23 09:00:52 localhost augenrules[708]: /sbin/augenrules: No change
Jan 23 09:00:52 localhost augenrules[723]: No rules
Jan 23 09:00:52 localhost augenrules[723]: enabled 1
Jan 23 09:00:52 localhost augenrules[723]: failure 1
Jan 23 09:00:52 localhost augenrules[723]: pid 703
Jan 23 09:00:52 localhost augenrules[723]: rate_limit 0
Jan 23 09:00:52 localhost augenrules[723]: backlog_limit 8192
Jan 23 09:00:52 localhost augenrules[723]: lost 0
Jan 23 09:00:52 localhost augenrules[723]: backlog 2
Jan 23 09:00:52 localhost augenrules[723]: backlog_wait_time 60000
Jan 23 09:00:52 localhost augenrules[723]: backlog_wait_time_actual 0
Jan 23 09:00:52 localhost augenrules[723]: enabled 1
Jan 23 09:00:52 localhost augenrules[723]: failure 1
Jan 23 09:00:52 localhost augenrules[723]: pid 703
Jan 23 09:00:52 localhost augenrules[723]: rate_limit 0
Jan 23 09:00:52 localhost augenrules[723]: backlog_limit 8192
Jan 23 09:00:52 localhost augenrules[723]: lost 0
Jan 23 09:00:52 localhost augenrules[723]: backlog 2
Jan 23 09:00:52 localhost augenrules[723]: backlog_wait_time 60000
Jan 23 09:00:52 localhost augenrules[723]: backlog_wait_time_actual 0
Jan 23 09:00:52 localhost augenrules[723]: enabled 1
Jan 23 09:00:52 localhost augenrules[723]: failure 1
Jan 23 09:00:52 localhost augenrules[723]: pid 703
Jan 23 09:00:52 localhost augenrules[723]: rate_limit 0
Jan 23 09:00:52 localhost augenrules[723]: backlog_limit 8192
Jan 23 09:00:52 localhost augenrules[723]: lost 0
Jan 23 09:00:52 localhost augenrules[723]: backlog 1
Jan 23 09:00:52 localhost augenrules[723]: backlog_wait_time 60000
Jan 23 09:00:52 localhost augenrules[723]: backlog_wait_time_actual 0
Jan 23 09:00:52 localhost systemd[1]: Started Security Auditing Service.
Jan 23 09:00:52 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 23 09:00:52 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 23 09:00:52 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 23 09:00:52 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 09:00:52 localhost systemd[1]: Starting Update is Completed...
Jan 23 09:00:52 localhost systemd[1]: Finished Update is Completed.
Jan 23 09:00:53 localhost systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 09:00:53 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 09:00:53 localhost systemd[1]: Reached target System Initialization.
Jan 23 09:00:53 localhost systemd[1]: Started dnf makecache --timer.
Jan 23 09:00:53 localhost systemd[1]: Started Daily rotation of log files.
Jan 23 09:00:53 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 23 09:00:53 localhost systemd[1]: Reached target Timer Units.
Jan 23 09:00:53 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 09:00:53 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 23 09:00:53 localhost systemd[1]: Reached target Socket Units.
Jan 23 09:00:53 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 23 09:00:53 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 09:00:53 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 23 09:00:53 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 23 09:00:53 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 09:00:53 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 23 09:00:53 localhost systemd-udevd[734]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:00:53 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 23 09:00:53 localhost systemd[1]: Reached target Basic System.
Jan 23 09:00:53 localhost dbus-broker-lau[744]: Ready
Jan 23 09:00:53 localhost systemd[1]: Starting NTP client/server...
Jan 23 09:00:53 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 23 09:00:53 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 23 09:00:53 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 23 09:00:53 localhost systemd[1]: Started irqbalance daemon.
Jan 23 09:00:53 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 23 09:00:53 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 09:00:53 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 09:00:53 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 09:00:53 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 23 09:00:53 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 23 09:00:53 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 23 09:00:53 localhost systemd[1]: Starting User Login Management...
Jan 23 09:00:53 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 23 09:00:53 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 23 09:00:53 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 23 09:00:53 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 23 09:00:53 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 23 09:00:53 localhost chronyd[794]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 09:00:53 localhost chronyd[794]: Loaded 0 symmetric keys
Jan 23 09:00:53 localhost chronyd[794]: Using right/UTC timezone to obtain leap second data
Jan 23 09:00:53 localhost chronyd[794]: Loaded seccomp filter (level 2)
Jan 23 09:00:53 localhost systemd[1]: Started NTP client/server.
Jan 23 09:00:53 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 23 09:00:53 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 23 09:00:53 localhost systemd-logind[786]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 09:00:53 localhost systemd-logind[786]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 09:00:53 localhost systemd-logind[786]: New seat seat0.
Jan 23 09:00:53 localhost systemd[1]: Started User Login Management.
Jan 23 09:00:53 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 23 09:00:53 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 23 09:00:53 localhost kernel: kvm_amd: TSC scaling supported
Jan 23 09:00:53 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 23 09:00:53 localhost kernel: kvm_amd: Nested Paging enabled
Jan 23 09:00:53 localhost kernel: kvm_amd: LBR virtualization supported
Jan 23 09:00:53 localhost kernel: Console: switching to colour dummy device 80x25
Jan 23 09:00:53 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 23 09:00:53 localhost kernel: [drm] features: -context_init
Jan 23 09:00:53 localhost kernel: [drm] number of scanouts: 1
Jan 23 09:00:53 localhost kernel: [drm] number of cap sets: 0
Jan 23 09:00:53 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 23 09:00:53 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 23 09:00:53 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 23 09:00:53 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 23 09:00:53 localhost iptables.init[779]: iptables: Applying firewall rules: [  OK  ]
Jan 23 09:00:53 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 23 09:00:53 localhost cloud-init[840]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 23 Jan 2026 09:00:53 +0000. Up 6.31 seconds.
Jan 23 09:00:53 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 23 09:00:53 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 23 09:00:53 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpgb1h963_.mount: Deactivated successfully.
Jan 23 09:00:53 localhost systemd[1]: Starting Hostname Service...
Jan 23 09:00:53 localhost systemd[1]: Started Hostname Service.
Jan 23 09:00:53 np0005593295.novalocal systemd-hostnamed[854]: Hostname set to <np0005593295.novalocal> (static)
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Reached target Preparation for Network.
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Starting Network Manager...
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.0905] NetworkManager (version 1.54.3-2.el9) is starting... (boot:20df1b08-a5ba-4a35-8d47-00aa8e9b2616)
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.0911] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.0982] manager[0x5566bace9000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1015] hostname: hostname: using hostnamed
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1016] hostname: static hostname changed from (none) to "np0005593295.novalocal"
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1021] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1115] manager[0x5566bace9000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1116] manager[0x5566bace9000]: rfkill: WWAN hardware radio set enabled
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1156] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1157] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1157] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1158] manager: Networking is enabled by state file
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1160] settings: Loaded settings plugin: keyfile (internal)
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1187] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1205] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1215] dhcp: init: Using DHCP client 'internal'
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1218] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1232] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1239] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1246] device (lo): Activation: starting connection 'lo' (a94dd518-f501-4cf9-bb13-731d2edd38ea)
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1254] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1256] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1282] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1288] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1290] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1292] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1294] device (eth0): carrier: link connected
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1297] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1302] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1306] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1310] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1311] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1312] manager: NetworkManager state is now CONNECTING
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1313] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1319] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1321] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Started Network Manager.
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Reached target Network.
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1502] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1506] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.1511] device (lo): Activation: successful, device activated.
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Reached target NFS client services.
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Reached target Remote File Systems.
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.3981] dhcp4 (eth0): state changed new lease, address=38.129.56.185
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.3992] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.4008] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.4052] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.4054] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.4056] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.4060] device (eth0): Activation: successful, device activated.
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.4064] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 09:00:54 np0005593295.novalocal NetworkManager[858]: <info>  [1769158854.4067] manager: startup complete
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 23 09:00:54 np0005593295.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 23 Jan 2026 09:00:54 +0000. Up 7.42 seconds.
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: |  eth0  | True |        38.129.56.185         | 255.255.255.0 | global | fa:16:3e:ac:8a:5e |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:feac:8a5e/64 |       .       |  link  | fa:16:3e:ac:8a:5e |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 23 09:00:54 np0005593295.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 09:00:55 np0005593295.novalocal useradd[987]: new group: name=cloud-user, GID=1001
Jan 23 09:00:55 np0005593295.novalocal useradd[987]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 23 09:00:55 np0005593295.novalocal useradd[987]: add 'cloud-user' to group 'adm'
Jan 23 09:00:55 np0005593295.novalocal useradd[987]: add 'cloud-user' to group 'systemd-journal'
Jan 23 09:00:55 np0005593295.novalocal useradd[987]: add 'cloud-user' to shadow group 'adm'
Jan 23 09:00:55 np0005593295.novalocal useradd[987]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: Generating public/private rsa key pair.
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: The key fingerprint is:
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: SHA256:m8AhXO3GeQnorWC7FsKZGOrxihbMoSkDu/Ok29CyBEg root@np0005593295.novalocal
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: The key's randomart image is:
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: +---[RSA 3072]----+
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |      .o         |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |   . .. o        |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: | E  o..+ o .     |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |=.  oo..* o      |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |O*.+ oooS.       |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |X*= o .. o       |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |=+*. o  o        |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |=X .o            |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |*=+.             |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: +----[SHA256]-----+
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: Generating public/private ecdsa key pair.
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: The key fingerprint is:
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: SHA256:qp30oz4J6SGx4H6Mw94XvmoahRSp8LSPmsJ+PzjTuFo root@np0005593295.novalocal
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: The key's randomart image is:
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: +---[ECDSA 256]---+
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: | ..              |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |..o              |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |o+ .             |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |+ =              |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |.o * .  S        |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: | .= =. .         |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |++oE=o+.         |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |+*+X.Ooo.        |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |+=B+O+Bo..       |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: +----[SHA256]-----+
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: Generating public/private ed25519 key pair.
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: The key fingerprint is:
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: SHA256:AWwv715Qz77GtzINhRcbXo5PsXIeyKYGXwjonJlEnTQ root@np0005593295.novalocal
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: The key's randomart image is:
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: +--[ED25519 256]--+
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |     .o.+E.      |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |      o+ +.   o..|
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |     .+.= o oo.Bo|
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |      .*.+ +.**+o|
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |       oS o *o+o.|
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |        .. =.  ..|
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |       .  o..o   |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |        ..  =.o  |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: |       ..  ..+.. |
Jan 23 09:00:55 np0005593295.novalocal cloud-init[921]: +----[SHA256]-----+
Jan 23 09:00:56 np0005593295.novalocal sm-notify[1003]: Version 2.5.4 starting
Jan 23 09:00:55 np0005593295.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 23 09:00:56 np0005593295.novalocal sshd[1005]: Server listening on 0.0.0.0 port 22.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 23 09:00:56 np0005593295.novalocal sshd[1005]: Server listening on :: port 22.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Reached target Network is Online.
Jan 23 09:00:56 np0005593295.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 23 09:00:56 np0005593295.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 23 09:00:56 np0005593295.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 73% if used.)
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 23 09:00:56 np0005593295.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Starting System Logging Service...
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Starting Permit User Sessions...
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Finished Permit User Sessions.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Started Command Scheduler.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Started Getty on tty1.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Reached target Login Prompts.
Jan 23 09:00:56 np0005593295.novalocal rsyslogd[1004]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1004" x-info="https://www.rsyslog.com"] start
Jan 23 09:00:56 np0005593295.novalocal rsyslogd[1004]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Started System Logging Service.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Reached target Multi-User System.
Jan 23 09:00:56 np0005593295.novalocal sshd-session[1010]: Connection reset by 38.102.83.114 port 45674 [preauth]
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 23 09:00:56 np0005593295.novalocal sshd-session[1020]: Unable to negotiate with 38.102.83.114 port 45680: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 23 09:00:56 np0005593295.novalocal rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:00:56 np0005593295.novalocal kdumpctl[1019]: kdump: No kdump initial ramdisk found.
Jan 23 09:00:56 np0005593295.novalocal kdumpctl[1019]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 23 09:00:56 np0005593295.novalocal sshd-session[1091]: Unable to negotiate with 38.102.83.114 port 45702: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 23 09:00:56 np0005593295.novalocal sshd-session[1105]: Unable to negotiate with 38.102.83.114 port 45708: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 23 09:00:56 np0005593295.novalocal sshd-session[1111]: Connection closed by 38.102.83.114 port 45710 [preauth]
Jan 23 09:00:56 np0005593295.novalocal sshd-session[1144]: Unable to negotiate with 38.102.83.114 port 45736: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 23 09:00:56 np0005593295.novalocal sshd-session[1030]: Connection closed by 38.102.83.114 port 45694 [preauth]
Jan 23 09:00:56 np0005593295.novalocal sshd-session[1152]: Unable to negotiate with 38.102.83.114 port 45746: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 23 09:00:56 np0005593295.novalocal sshd-session[1128]: Connection closed by 38.102.83.114 port 45722 [preauth]
Jan 23 09:00:56 np0005593295.novalocal cloud-init[1188]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 23 Jan 2026 09:00:56 +0000. Up 9.08 seconds.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 23 09:00:56 np0005593295.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 23 09:00:56 np0005593295.novalocal dracut[1284]: dracut-057-102.git20250818.el9
Jan 23 09:00:56 np0005593295.novalocal dracut[1286]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 23 09:00:56 np0005593295.novalocal cloud-init[1349]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 23 Jan 2026 09:00:56 +0000. Up 9.54 seconds.
Jan 23 09:00:56 np0005593295.novalocal cloud-init[1362]: #############################################################
Jan 23 09:00:56 np0005593295.novalocal cloud-init[1366]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 23 09:00:56 np0005593295.novalocal cloud-init[1368]: 256 SHA256:qp30oz4J6SGx4H6Mw94XvmoahRSp8LSPmsJ+PzjTuFo root@np0005593295.novalocal (ECDSA)
Jan 23 09:00:56 np0005593295.novalocal cloud-init[1373]: 256 SHA256:AWwv715Qz77GtzINhRcbXo5PsXIeyKYGXwjonJlEnTQ root@np0005593295.novalocal (ED25519)
Jan 23 09:00:56 np0005593295.novalocal cloud-init[1375]: 3072 SHA256:m8AhXO3GeQnorWC7FsKZGOrxihbMoSkDu/Ok29CyBEg root@np0005593295.novalocal (RSA)
Jan 23 09:00:56 np0005593295.novalocal cloud-init[1377]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 23 09:00:56 np0005593295.novalocal cloud-init[1380]: #############################################################
Jan 23 09:00:57 np0005593295.novalocal cloud-init[1349]: Cloud-init v. 24.4-8.el9 finished at Fri, 23 Jan 2026 09:00:57 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.74 seconds
Jan 23 09:00:57 np0005593295.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 23 09:00:57 np0005593295.novalocal systemd[1]: Reached target Cloud-init target.
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: memstrack is not available
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 09:00:57 np0005593295.novalocal dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: memstrack is not available
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: *** Including module: systemd ***
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: *** Including module: fips ***
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: *** Including module: systemd-initrd ***
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: *** Including module: i18n ***
Jan 23 09:00:58 np0005593295.novalocal dracut[1286]: *** Including module: drm ***
Jan 23 09:00:59 np0005593295.novalocal chronyd[794]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Jan 23 09:00:59 np0005593295.novalocal chronyd[794]: System clock TAI offset set to 37 seconds
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]: *** Including module: prefixdevname ***
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]: *** Including module: kernel-modules ***
Jan 23 09:00:59 np0005593295.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]: *** Including module: kernel-modules-extra ***
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]: *** Including module: qemu ***
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]: *** Including module: fstab-sys ***
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]: *** Including module: rootfs-block ***
Jan 23 09:00:59 np0005593295.novalocal dracut[1286]: *** Including module: terminfo ***
Jan 23 09:01:00 np0005593295.novalocal dracut[1286]: *** Including module: udev-rules ***
Jan 23 09:01:00 np0005593295.novalocal dracut[1286]: Skipping udev rule: 91-permissions.rules
Jan 23 09:01:00 np0005593295.novalocal dracut[1286]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 23 09:01:00 np0005593295.novalocal dracut[1286]: *** Including module: virtiofs ***
Jan 23 09:01:00 np0005593295.novalocal dracut[1286]: *** Including module: dracut-systemd ***
Jan 23 09:01:00 np0005593295.novalocal dracut[1286]: *** Including module: usrmount ***
Jan 23 09:01:00 np0005593295.novalocal dracut[1286]: *** Including module: base ***
Jan 23 09:01:00 np0005593295.novalocal dracut[1286]: *** Including module: fs-lib ***
Jan 23 09:01:00 np0005593295.novalocal dracut[1286]: *** Including module: kdumpbase ***
Jan 23 09:01:01 np0005593295.novalocal CROND[2751]: (root) CMD (run-parts /etc/cron.hourly)
Jan 23 09:01:01 np0005593295.novalocal run-parts[2761]: (/etc/cron.hourly) starting 0anacron
Jan 23 09:01:01 np0005593295.novalocal anacron[2783]: Anacron started on 2026-01-23
Jan 23 09:01:01 np0005593295.novalocal anacron[2783]: Will run job `cron.daily' in 32 min.
Jan 23 09:01:01 np0005593295.novalocal anacron[2783]: Will run job `cron.weekly' in 52 min.
Jan 23 09:01:01 np0005593295.novalocal anacron[2783]: Will run job `cron.monthly' in 72 min.
Jan 23 09:01:01 np0005593295.novalocal anacron[2783]: Jobs will be executed sequentially
Jan 23 09:01:01 np0005593295.novalocal run-parts[2786]: (/etc/cron.hourly) finished 0anacron
Jan 23 09:01:01 np0005593295.novalocal CROND[2747]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:   microcode_ctl module: mangling fw_dir
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]: *** Including module: openssl ***
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]: *** Including module: shutdown ***
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]: *** Including module: squash ***
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]: *** Including modules done ***
Jan 23 09:01:01 np0005593295.novalocal dracut[1286]: *** Installing kernel module dependencies ***
Jan 23 09:01:02 np0005593295.novalocal dracut[1286]: *** Installing kernel module dependencies done ***
Jan 23 09:01:02 np0005593295.novalocal dracut[1286]: *** Resolving executable dependencies ***
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: IRQ 35 affinity is now unmanaged
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: IRQ 33 affinity is now unmanaged
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: IRQ 31 affinity is now unmanaged
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: IRQ 28 affinity is now unmanaged
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: IRQ 34 affinity is now unmanaged
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: IRQ 32 affinity is now unmanaged
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: IRQ 30 affinity is now unmanaged
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 23 09:01:03 np0005593295.novalocal irqbalance[780]: IRQ 29 affinity is now unmanaged
Jan 23 09:01:03 np0005593295.novalocal dracut[1286]: *** Resolving executable dependencies done ***
Jan 23 09:01:03 np0005593295.novalocal dracut[1286]: *** Generating early-microcode cpio image ***
Jan 23 09:01:03 np0005593295.novalocal dracut[1286]: *** Store current command line parameters ***
Jan 23 09:01:03 np0005593295.novalocal dracut[1286]: Stored kernel commandline:
Jan 23 09:01:03 np0005593295.novalocal dracut[1286]: No dracut internal kernel commandline stored in the initramfs
Jan 23 09:01:04 np0005593295.novalocal dracut[1286]: *** Install squash loader ***
Jan 23 09:01:04 np0005593295.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:01:05 np0005593295.novalocal dracut[1286]: *** Squashing the files inside the initramfs ***
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: *** Squashing the files inside the initramfs done ***
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: *** Hardlinking files ***
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: Mode:           real
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: Files:          50
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: Linked:         0 files
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: Compared:       0 xattrs
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: Compared:       0 files
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: Saved:          0 B
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: Duration:       0.000618 seconds
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: *** Hardlinking files done ***
Jan 23 09:01:06 np0005593295.novalocal dracut[1286]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 23 09:01:07 np0005593295.novalocal kdumpctl[1019]: kdump: kexec: loaded kdump kernel
Jan 23 09:01:07 np0005593295.novalocal kdumpctl[1019]: kdump: Starting kdump: [OK]
Jan 23 09:01:07 np0005593295.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 23 09:01:07 np0005593295.novalocal systemd[1]: Startup finished in 1.630s (kernel) + 2.548s (initrd) + 15.629s (userspace) = 19.809s.
Jan 23 09:01:24 np0005593295.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 09:01:44 np0005593295.novalocal sshd-session[4321]: Accepted publickey for zuul from 38.102.83.114 port 55038 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 23 09:01:44 np0005593295.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 23 09:01:44 np0005593295.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 23 09:01:44 np0005593295.novalocal systemd-logind[786]: New session 1 of user zuul.
Jan 23 09:01:44 np0005593295.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 23 09:01:44 np0005593295.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Queued start job for default target Main User Target.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Created slice User Application Slice.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Reached target Paths.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Reached target Timers.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Starting D-Bus User Message Bus Socket...
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Starting Create User's Volatile Files and Directories...
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Finished Create User's Volatile Files and Directories.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Reached target Sockets.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Reached target Basic System.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Reached target Main User Target.
Jan 23 09:01:44 np0005593295.novalocal systemd[4325]: Startup finished in 125ms.
Jan 23 09:01:44 np0005593295.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 23 09:01:44 np0005593295.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 23 09:01:44 np0005593295.novalocal sshd-session[4321]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:01:45 np0005593295.novalocal python3[4407]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:49 np0005593295.novalocal python3[4435]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:55 np0005593295.novalocal python3[4493]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:56 np0005593295.novalocal python3[4533]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 23 09:01:58 np0005593295.novalocal python3[4559]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQChWBsfs5FtlYIS47KhLNXtsYVhP6UT/w4WYq1l1d/b7+cXPAwAb4Qt1cc/BmNcKM419a6D+CvPejxC67s0h4ksuceBjB/s6b88/zjf8Lio8Dd87f6J+f6IY8ByYIQ8s3Hvn6z0K7HSyEMuQ0B/CLxeBW4MJFqcoLK2v7Y8SNPGLr8w/8y79OWnJJPKmfM4ACTo2JwqmPGI/4+LQsCZS/p/yKDTO5AYxsIUwWw/IX3Jxs67UOBqa40onmgM/VRkfGY512fziVUNkmFHG2Aqgosbpbz/XysrVTpvLRA/H2zpGbbTbuEg6xp8vHQO5V0csAd6p3cdOixjdaPmf9oy3+yXuIeWwnnxPHqvVDY6N9aaIX4vuajxOoMUFiQ2YtcDq7sCn8HoateyYgIL/u2+pInArUiYGemyMEWja0DhD6UdCkY0Ea+YDWeIZKM505N+HClR5jfjjVW35TndY+AldV5OhOzMRmPjtJYS8a0usUXRvmxRfMFSmO9CI1RfNmod9X0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:01:59 np0005593295.novalocal python3[4583]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:59 np0005593295.novalocal python3[4682]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:00 np0005593295.novalocal python3[4753]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158919.5010436-254-168213695087426/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=79d1f7c5e92f4d57bb17665cf28be8d8_id_rsa follow=False checksum=70fc72f3adde7c23bd22f0e2ad4ebdd2e15c011a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:00 np0005593295.novalocal python3[4876]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:01 np0005593295.novalocal python3[4947]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158920.5483115-309-25150256250709/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=79d1f7c5e92f4d57bb17665cf28be8d8_id_rsa.pub follow=False checksum=1817e5216c13f90f69486a375706d090e99f2d79 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:02 np0005593295.novalocal python3[4995]: ansible-ping Invoked with data=pong
Jan 23 09:02:03 np0005593295.novalocal python3[5019]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:02:05 np0005593295.novalocal python3[5077]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 23 09:02:07 np0005593295.novalocal python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:07 np0005593295.novalocal python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:07 np0005593295.novalocal python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:07 np0005593295.novalocal python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:08 np0005593295.novalocal python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:08 np0005593295.novalocal python3[5229]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:10 np0005593295.novalocal sudo[5253]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfjwueghwhokssncforyzyhkzjqaeade ; /usr/bin/python3'
Jan 23 09:02:10 np0005593295.novalocal sudo[5253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:10 np0005593295.novalocal python3[5255]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:10 np0005593295.novalocal sudo[5253]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:10 np0005593295.novalocal sudo[5331]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbwudmsnvqhvuvtwqkibacponynzdbaq ; /usr/bin/python3'
Jan 23 09:02:10 np0005593295.novalocal sudo[5331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:10 np0005593295.novalocal python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:10 np0005593295.novalocal sudo[5331]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:11 np0005593295.novalocal sudo[5404]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqmpnofudqbwkpbflgaarnkfdzjwsngx ; /usr/bin/python3'
Jan 23 09:02:11 np0005593295.novalocal sudo[5404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:11 np0005593295.novalocal python3[5406]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158930.469577-34-237662813535663/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:11 np0005593295.novalocal sudo[5404]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:12 np0005593295.novalocal python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:12 np0005593295.novalocal python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:12 np0005593295.novalocal python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:12 np0005593295.novalocal python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:13 np0005593295.novalocal python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:13 np0005593295.novalocal python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:13 np0005593295.novalocal python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:13 np0005593295.novalocal python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:14 np0005593295.novalocal python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:14 np0005593295.novalocal python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:14 np0005593295.novalocal python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:14 np0005593295.novalocal python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:15 np0005593295.novalocal python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:15 np0005593295.novalocal python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:15 np0005593295.novalocal python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:16 np0005593295.novalocal python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:16 np0005593295.novalocal python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:16 np0005593295.novalocal python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:16 np0005593295.novalocal python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:17 np0005593295.novalocal python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:17 np0005593295.novalocal python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:17 np0005593295.novalocal python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:18 np0005593295.novalocal python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:18 np0005593295.novalocal python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:18 np0005593295.novalocal python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:18 np0005593295.novalocal python3[6054]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:21 np0005593295.novalocal sudo[6078]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptckofzwkrpvjynoslpqdavaddpsfqna ; /usr/bin/python3'
Jan 23 09:02:21 np0005593295.novalocal sudo[6078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:21 np0005593295.novalocal python3[6080]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 09:02:21 np0005593295.novalocal systemd[1]: Starting Time & Date Service...
Jan 23 09:02:22 np0005593295.novalocal systemd[1]: Started Time & Date Service.
Jan 23 09:02:22 np0005593295.novalocal systemd-timedated[6082]: Changed time zone to 'UTC' (UTC).
Jan 23 09:02:22 np0005593295.novalocal sudo[6078]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:22 np0005593295.novalocal sudo[6109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvztsehtphfkamrgxwbxnfywbgvhszsc ; /usr/bin/python3'
Jan 23 09:02:22 np0005593295.novalocal sudo[6109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:22 np0005593295.novalocal python3[6111]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:22 np0005593295.novalocal sudo[6109]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:22 np0005593295.novalocal python3[6187]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:23 np0005593295.novalocal python3[6258]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769158942.6668499-254-128377911828478/source _original_basename=tmp452_6lu8 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:23 np0005593295.novalocal python3[6358]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:24 np0005593295.novalocal python3[6429]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769158943.5654054-304-117808605601701/source _original_basename=tmpiaqmet1f follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:24 np0005593295.novalocal sudo[6529]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iojtihfkcawypykrnamccpzqrsojbfmo ; /usr/bin/python3'
Jan 23 09:02:24 np0005593295.novalocal sudo[6529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:24 np0005593295.novalocal python3[6531]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:24 np0005593295.novalocal sudo[6529]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:25 np0005593295.novalocal sudo[6602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbsewedvchyvuokuyswkpjgtfwpdeiji ; /usr/bin/python3'
Jan 23 09:02:25 np0005593295.novalocal sudo[6602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:25 np0005593295.novalocal python3[6604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769158944.7552545-384-176906075530815/source _original_basename=tmpgty3_bc_ follow=False checksum=8f68793d163f2a5535dcdbaa3731e7670c26af6c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:25 np0005593295.novalocal sudo[6602]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:26 np0005593295.novalocal python3[6652]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:02:26 np0005593295.novalocal python3[6678]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:02:27 np0005593295.novalocal sudo[6756]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woxeryowmfqdpglkicsyepbbzirtghdy ; /usr/bin/python3'
Jan 23 09:02:27 np0005593295.novalocal sudo[6756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:27 np0005593295.novalocal python3[6758]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:27 np0005593295.novalocal sudo[6756]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:27 np0005593295.novalocal sudo[6829]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdlvfdejbmjphltwtetrqvbhebuxrokn ; /usr/bin/python3'
Jan 23 09:02:27 np0005593295.novalocal sudo[6829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:27 np0005593295.novalocal python3[6831]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158947.3532264-454-71423814110502/source _original_basename=tmpdkzgjeqd follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:27 np0005593295.novalocal sudo[6829]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:28 np0005593295.novalocal sudo[6880]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfccpacvwhfwftxkpqkryjnonxamxtpu ; /usr/bin/python3'
Jan 23 09:02:28 np0005593295.novalocal sudo[6880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:28 np0005593295.novalocal python3[6882]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-639e-86bd-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:02:28 np0005593295.novalocal sudo[6880]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:29 np0005593295.novalocal python3[6910]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ef9-e89a-639e-86bd-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 23 09:02:30 np0005593295.novalocal python3[6938]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:48 np0005593295.novalocal sudo[6962]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djycvfvjdcvxanxrhcixkonjnpmmavdg ; /usr/bin/python3'
Jan 23 09:02:48 np0005593295.novalocal sudo[6962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:48 np0005593295.novalocal python3[6964]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:48 np0005593295.novalocal sudo[6962]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:52 np0005593295.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 09:03:46 np0005593295.novalocal systemd[4325]: Starting Mark boot as successful...
Jan 23 09:03:46 np0005593295.novalocal systemd[4325]: Finished Mark boot as successful.
Jan 23 09:03:48 np0005593295.novalocal sshd-session[4334]: Received disconnect from 38.102.83.114 port 55038:11: disconnected by user
Jan 23 09:03:48 np0005593295.novalocal sshd-session[4334]: Disconnected from user zuul 38.102.83.114 port 55038
Jan 23 09:03:48 np0005593295.novalocal sshd-session[4321]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:03:48 np0005593295.novalocal systemd-logind[786]: Session 1 logged out. Waiting for processes to exit.
Jan 23 09:04:18 np0005593295.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 09:04:18 np0005593295.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 23 09:04:18 np0005593295.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 23 09:04:18 np0005593295.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 23 09:04:18 np0005593295.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 23 09:04:18 np0005593295.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 23 09:04:18 np0005593295.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 23 09:04:18 np0005593295.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 23 09:04:18 np0005593295.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 23 09:04:18 np0005593295.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4490] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 09:04:18 np0005593295.novalocal systemd-udevd[6969]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4637] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4661] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4663] device (eth1): carrier: link connected
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4665] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4671] policy: auto-activating connection 'Wired connection 1' (52728e87-b91d-3812-9239-09489880e5d3)
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4675] device (eth1): Activation: starting connection 'Wired connection 1' (52728e87-b91d-3812-9239-09489880e5d3)
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4675] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4678] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4681] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:04:18 np0005593295.novalocal NetworkManager[858]: <info>  [1769159058.4685] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:04:19 np0005593295.novalocal sshd-session[6972]: Accepted publickey for zuul from 38.102.83.114 port 51794 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:04:19 np0005593295.novalocal systemd-logind[786]: New session 3 of user zuul.
Jan 23 09:04:19 np0005593295.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 23 09:04:19 np0005593295.novalocal sshd-session[6972]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:04:19 np0005593295.novalocal python3[6999]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-4543-3693-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:04:29 np0005593295.novalocal sudo[7077]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hldmlaaehixebssdzsfuyeljernsnkfh ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:04:29 np0005593295.novalocal sudo[7077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:29 np0005593295.novalocal python3[7079]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:04:29 np0005593295.novalocal sudo[7077]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:29 np0005593295.novalocal sudo[7150]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taxpvemyzuyaiwebozvcjtpaisuyfosq ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:04:29 np0005593295.novalocal sudo[7150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:29 np0005593295.novalocal python3[7152]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159069.1411424-206-135901253566715/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=9da3c4bf865aebe0db6de516128830b2cb557851 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:04:29 np0005593295.novalocal sudo[7150]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:29 np0005593295.novalocal sudo[7200]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xomsewpvswwghqspbdqnebaxmpnnunpa ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:04:29 np0005593295.novalocal sudo[7200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:30 np0005593295.novalocal python3[7202]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:04:30 np0005593295.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 09:04:30 np0005593295.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 23 09:04:30 np0005593295.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 23 09:04:30 np0005593295.novalocal NetworkManager[858]: <info>  [1769159070.2515] caught SIGTERM, shutting down normally.
Jan 23 09:04:30 np0005593295.novalocal systemd[1]: Stopping Network Manager...
Jan 23 09:04:30 np0005593295.novalocal NetworkManager[858]: <info>  [1769159070.2523] dhcp4 (eth0): canceled DHCP transaction
Jan 23 09:04:30 np0005593295.novalocal NetworkManager[858]: <info>  [1769159070.2525] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:04:30 np0005593295.novalocal NetworkManager[858]: <info>  [1769159070.2525] dhcp4 (eth0): state changed no lease
Jan 23 09:04:30 np0005593295.novalocal NetworkManager[858]: <info>  [1769159070.2526] manager: NetworkManager state is now CONNECTING
Jan 23 09:04:30 np0005593295.novalocal NetworkManager[858]: <info>  [1769159070.2588] dhcp4 (eth1): canceled DHCP transaction
Jan 23 09:04:30 np0005593295.novalocal NetworkManager[858]: <info>  [1769159070.2589] dhcp4 (eth1): state changed no lease
Jan 23 09:04:30 np0005593295.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:04:30 np0005593295.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:04:40 np0005593295.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:04:43 np0005593295.novalocal NetworkManager[858]: <info>  [1769159083.9221] exiting (success)
Jan 23 09:04:43 np0005593295.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 09:04:43 np0005593295.novalocal systemd[1]: Stopped Network Manager.
Jan 23 09:04:43 np0005593295.novalocal systemd[1]: NetworkManager.service: Consumed 1.322s CPU time, 9.9M memory peak.
Jan 23 09:04:43 np0005593295.novalocal systemd[1]: Starting Network Manager...
Jan 23 09:04:43 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159083.9833] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:20df1b08-a5ba-4a35-8d47-00aa8e9b2616)
Jan 23 09:04:43 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159083.9834] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 09:04:43 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159083.9884] manager[0x55f8e0457000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 09:04:44 np0005593295.novalocal systemd[1]: Starting Hostname Service...
Jan 23 09:04:44 np0005593295.novalocal systemd[1]: Started Hostname Service.
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0595] hostname: hostname: using hostnamed
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0596] hostname: static hostname changed from (none) to "np0005593295.novalocal"
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0603] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0609] manager[0x55f8e0457000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0610] manager[0x55f8e0457000]: rfkill: WWAN hardware radio set enabled
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0651] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0652] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0653] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0653] manager: Networking is enabled by state file
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0657] settings: Loaded settings plugin: keyfile (internal)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0663] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0711] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0728] dhcp: init: Using DHCP client 'internal'
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0732] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0742] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0752] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0765] device (lo): Activation: starting connection 'lo' (a94dd518-f501-4cf9-bb13-731d2edd38ea)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0777] device (eth0): carrier: link connected
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0784] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0793] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0794] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0806] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0818] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0828] device (eth1): carrier: link connected
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0834] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0843] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (52728e87-b91d-3812-9239-09489880e5d3) (indicated)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0844] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0854] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0865] device (eth1): Activation: starting connection 'Wired connection 1' (52728e87-b91d-3812-9239-09489880e5d3)
Jan 23 09:04:44 np0005593295.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0876] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 09:04:44 np0005593295.novalocal systemd[1]: Started Network Manager.
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0884] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0891] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0894] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0899] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0906] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0911] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0915] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0921] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0933] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0939] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0954] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0959] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:04:44 np0005593295.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.0987] dhcp4 (eth0): state changed new lease, address=38.129.56.185
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.1003] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 09:04:44 np0005593295.novalocal sudo[7200]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:44 np0005593295.novalocal python3[7264]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-4543-3693-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.8408] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.8435] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.8445] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.8451] device (lo): Activation: successful, device activated.
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.8503] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.8510] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.8519] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.8523] device (eth0): Activation: successful, device activated.
Jan 23 09:04:44 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159084.8531] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 09:04:54 np0005593295.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:05:14 np0005593295.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.2911] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 09:05:29 np0005593295.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:05:29 np0005593295.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3204] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3207] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3211] device (eth1): Activation: successful, device activated.
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3216] manager: startup complete
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3219] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <warn>  [1769159129.3222] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3228] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 23 09:05:29 np0005593295.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3318] dhcp4 (eth1): canceled DHCP transaction
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3318] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3318] dhcp4 (eth1): state changed no lease
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3330] policy: auto-activating connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918)
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3334] device (eth1): Activation: starting connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918)
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3335] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3337] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3343] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3349] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3462] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3464] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:05:29 np0005593295.novalocal NetworkManager[7219]: <info>  [1769159129.3468] device (eth1): Activation: successful, device activated.
Jan 23 09:05:39 np0005593295.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:05:44 np0005593295.novalocal sshd-session[6975]: Received disconnect from 38.102.83.114 port 51794:11: disconnected by user
Jan 23 09:05:44 np0005593295.novalocal sshd-session[6975]: Disconnected from user zuul 38.102.83.114 port 51794
Jan 23 09:05:44 np0005593295.novalocal sshd-session[6972]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:05:44 np0005593295.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 23 09:05:44 np0005593295.novalocal systemd[1]: session-3.scope: Consumed 1.378s CPU time.
Jan 23 09:05:44 np0005593295.novalocal systemd-logind[786]: Session 3 logged out. Waiting for processes to exit.
Jan 23 09:05:44 np0005593295.novalocal systemd-logind[786]: Removed session 3.
Jan 23 09:05:54 np0005593295.novalocal sshd-session[7318]: Accepted publickey for zuul from 38.102.83.114 port 36806 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:05:54 np0005593295.novalocal systemd-logind[786]: New session 4 of user zuul.
Jan 23 09:05:54 np0005593295.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 23 09:05:54 np0005593295.novalocal sshd-session[7318]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:05:54 np0005593295.novalocal sudo[7397]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmqjqnqrknobjleoyakixsoddrecmtqk ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:05:54 np0005593295.novalocal sudo[7397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:54 np0005593295.novalocal python3[7399]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:05:54 np0005593295.novalocal sudo[7397]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:55 np0005593295.novalocal sudo[7470]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffawotaamyyiucmfwtehtqkqmnssnhve ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:05:55 np0005593295.novalocal sudo[7470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:55 np0005593295.novalocal python3[7472]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159154.5039935-373-168282419267218/source _original_basename=tmpv125ucd2 follow=False checksum=6e1e8970cf6ad2f0b1a32d462d71e8a0528ec2d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:55 np0005593295.novalocal sudo[7470]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:57 np0005593295.novalocal sshd-session[7321]: Connection closed by 38.102.83.114 port 36806
Jan 23 09:05:57 np0005593295.novalocal sshd-session[7318]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:05:57 np0005593295.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 23 09:05:57 np0005593295.novalocal systemd-logind[786]: Session 4 logged out. Waiting for processes to exit.
Jan 23 09:05:57 np0005593295.novalocal systemd-logind[786]: Removed session 4.
Jan 23 09:06:46 np0005593295.novalocal systemd[4325]: Created slice User Background Tasks Slice.
Jan 23 09:06:46 np0005593295.novalocal systemd[4325]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 09:06:46 np0005593295.novalocal systemd[4325]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 09:14:49 np0005593295.novalocal sshd-session[7504]: Accepted publickey for zuul from 38.102.83.114 port 50422 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:14:49 np0005593295.novalocal systemd-logind[786]: New session 5 of user zuul.
Jan 23 09:14:49 np0005593295.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 23 09:14:49 np0005593295.novalocal sshd-session[7504]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:14:49 np0005593295.novalocal sudo[7531]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vepytqcyofeotriypqumiceebhsesehy ; /usr/bin/python3'
Jan 23 09:14:49 np0005593295.novalocal sudo[7531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:50 np0005593295.novalocal python3[7533]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ef9-e89a-5353-1fb2-00000000217f-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:50 np0005593295.novalocal sudo[7531]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:50 np0005593295.novalocal sudo[7559]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odbxrboujeipzvufuqwpawevckooheqc ; /usr/bin/python3'
Jan 23 09:14:50 np0005593295.novalocal sudo[7559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:50 np0005593295.novalocal python3[7561]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:50 np0005593295.novalocal sudo[7559]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:50 np0005593295.novalocal sudo[7586]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcvaksqqhhmscwmmcwcwmdpsdbmplcxs ; /usr/bin/python3'
Jan 23 09:14:50 np0005593295.novalocal sudo[7586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:50 np0005593295.novalocal python3[7588]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:50 np0005593295.novalocal sudo[7586]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:50 np0005593295.novalocal sudo[7612]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rojbmeufczvfhrnhgcffstueaarcobfq ; /usr/bin/python3'
Jan 23 09:14:50 np0005593295.novalocal sudo[7612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:51 np0005593295.novalocal python3[7614]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:51 np0005593295.novalocal sudo[7612]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:51 np0005593295.novalocal sudo[7638]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwlfdwvcmozuhouldmdkfuyvkytwcunb ; /usr/bin/python3'
Jan 23 09:14:51 np0005593295.novalocal sudo[7638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:51 np0005593295.novalocal python3[7640]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:51 np0005593295.novalocal sudo[7638]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:51 np0005593295.novalocal sudo[7664]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxfzwiywlqgkrpccymozpmbdsohmhlfj ; /usr/bin/python3'
Jan 23 09:14:51 np0005593295.novalocal sudo[7664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:52 np0005593295.novalocal python3[7666]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:52 np0005593295.novalocal sudo[7664]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:52 np0005593295.novalocal sudo[7742]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugsrweisjmyzrokwbgxupstigeiqlbpo ; /usr/bin/python3'
Jan 23 09:14:52 np0005593295.novalocal sudo[7742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:52 np0005593295.novalocal python3[7744]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:14:52 np0005593295.novalocal sudo[7742]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:52 np0005593295.novalocal sudo[7815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlbvbuetaexuoqnoymbfhuilchkdmahe ; /usr/bin/python3'
Jan 23 09:14:52 np0005593295.novalocal sudo[7815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:53 np0005593295.novalocal python3[7817]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159692.5070572-547-243315707134321/source _original_basename=tmprhbvswkc follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:53 np0005593295.novalocal sudo[7815]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:54 np0005593295.novalocal sudo[7865]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-funocjakfoizgviwrerhrftewsnowocz ; /usr/bin/python3'
Jan 23 09:14:54 np0005593295.novalocal sudo[7865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:54 np0005593295.novalocal python3[7867]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:14:54 np0005593295.novalocal systemd[1]: Reloading.
Jan 23 09:14:54 np0005593295.novalocal systemd-rc-local-generator[7886]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:14:54 np0005593295.novalocal sudo[7865]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:56 np0005593295.novalocal sudo[7921]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qytwetxyagkwbsuffaktrsfpyqodwqja ; /usr/bin/python3'
Jan 23 09:14:56 np0005593295.novalocal sudo[7921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:56 np0005593295.novalocal python3[7923]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 23 09:14:56 np0005593295.novalocal sudo[7921]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:56 np0005593295.novalocal sudo[7947]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unrsdxryzsszwuprmugmvojlzxxpqdae ; /usr/bin/python3'
Jan 23 09:14:56 np0005593295.novalocal sudo[7947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:56 np0005593295.novalocal python3[7949]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:56 np0005593295.novalocal sudo[7947]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:57 np0005593295.novalocal sudo[7975]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrtwbirzxveblonoqehyomzgjnultann ; /usr/bin/python3'
Jan 23 09:14:57 np0005593295.novalocal sudo[7975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:57 np0005593295.novalocal python3[7977]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:57 np0005593295.novalocal sudo[7975]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:57 np0005593295.novalocal sudo[8003]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjtucduqpumtyfqftyotnjthianzyeni ; /usr/bin/python3'
Jan 23 09:14:57 np0005593295.novalocal sudo[8003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:57 np0005593295.novalocal python3[8005]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:57 np0005593295.novalocal sudo[8003]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:57 np0005593295.novalocal sudo[8031]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-senhfxmuyoniixgtzvlfhndoighcasad ; /usr/bin/python3'
Jan 23 09:14:57 np0005593295.novalocal sudo[8031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:57 np0005593295.novalocal python3[8033]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:57 np0005593295.novalocal sudo[8031]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:58 np0005593295.novalocal python3[8060]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-5353-1fb2-000000002186-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:59 np0005593295.novalocal python3[8090]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 09:15:03 np0005593295.novalocal sshd-session[7507]: Connection closed by 38.102.83.114 port 50422
Jan 23 09:15:03 np0005593295.novalocal sshd-session[7504]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:15:03 np0005593295.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 23 09:15:03 np0005593295.novalocal systemd[1]: session-5.scope: Consumed 3.841s CPU time.
Jan 23 09:15:03 np0005593295.novalocal systemd-logind[786]: Session 5 logged out. Waiting for processes to exit.
Jan 23 09:15:03 np0005593295.novalocal systemd-logind[786]: Removed session 5.
Jan 23 09:15:05 np0005593295.novalocal sshd-session[8095]: Accepted publickey for zuul from 38.102.83.114 port 52072 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:15:05 np0005593295.novalocal systemd-logind[786]: New session 6 of user zuul.
Jan 23 09:15:05 np0005593295.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 23 09:15:05 np0005593295.novalocal sshd-session[8095]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:15:05 np0005593295.novalocal sudo[8122]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzkgaomymhxbcmgekosbvhejyzzwcznx ; /usr/bin/python3'
Jan 23 09:15:05 np0005593295.novalocal sudo[8122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:15:05 np0005593295.novalocal python3[8124]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 09:15:14 np0005593295.novalocal setsebool[8166]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 23 09:15:14 np0005593295.novalocal setsebool[8166]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 23 09:15:30 np0005593295.novalocal kernel: SELinux:  Converting 386 SID table entries...
Jan 23 09:15:30 np0005593295.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:15:30 np0005593295.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 23 09:15:30 np0005593295.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:15:30 np0005593295.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:15:30 np0005593295.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:15:30 np0005593295.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:15:30 np0005593295.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:15:43 np0005593295.novalocal kernel: SELinux:  Converting 389 SID table entries...
Jan 23 09:15:43 np0005593295.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:15:43 np0005593295.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 23 09:15:43 np0005593295.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:15:43 np0005593295.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:15:43 np0005593295.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:15:43 np0005593295.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:15:43 np0005593295.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:16:00 np0005593295.novalocal dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 09:16:00 np0005593295.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Jan 23 09:16:00 np0005593295.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 23 09:16:00 np0005593295.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Jan 23 09:16:00 np0005593295.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 23 09:16:00 np0005593295.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:16:01 np0005593295.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:16:01 np0005593295.novalocal systemd[1]: Reloading.
Jan 23 09:16:01 np0005593295.novalocal systemd-rc-local-generator[8939]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:16:01 np0005593295.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:16:02 np0005593295.novalocal sudo[8122]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:15 np0005593295.novalocal python3[17841]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-f136-f057-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:16:18 np0005593295.novalocal kernel: evm: overlay not supported
Jan 23 09:16:18 np0005593295.novalocal systemd[4325]: Starting D-Bus User Message Bus...
Jan 23 09:16:18 np0005593295.novalocal dbus-broker-launch[18385]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 23 09:16:18 np0005593295.novalocal dbus-broker-launch[18385]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 23 09:16:18 np0005593295.novalocal systemd[4325]: Started D-Bus User Message Bus.
Jan 23 09:16:18 np0005593295.novalocal dbus-broker-lau[18385]: Ready
Jan 23 09:16:18 np0005593295.novalocal systemd[4325]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 09:16:18 np0005593295.novalocal systemd[4325]: Created slice Slice /user.
Jan 23 09:16:18 np0005593295.novalocal systemd[4325]: podman-18317.scope: unit configures an IP firewall, but not running as root.
Jan 23 09:16:18 np0005593295.novalocal systemd[4325]: (This warning is only shown for the first unit using IP firewalling.)
Jan 23 09:16:18 np0005593295.novalocal systemd[4325]: Started podman-18317.scope.
Jan 23 09:16:18 np0005593295.novalocal systemd[4325]: Started podman-pause-cc5e626e.scope.
Jan 23 09:16:20 np0005593295.novalocal sshd-session[8098]: Connection closed by 38.102.83.114 port 52072
Jan 23 09:16:20 np0005593295.novalocal sshd-session[8095]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:16:20 np0005593295.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Jan 23 09:16:20 np0005593295.novalocal systemd[1]: session-6.scope: Consumed 48.066s CPU time.
Jan 23 09:16:20 np0005593295.novalocal systemd-logind[786]: Session 6 logged out. Waiting for processes to exit.
Jan 23 09:16:20 np0005593295.novalocal systemd-logind[786]: Removed session 6.
Jan 23 09:16:37 np0005593295.novalocal sshd-session[26600]: Unable to negotiate with 38.129.56.17 port 48208: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 23 09:16:37 np0005593295.novalocal sshd-session[26602]: Unable to negotiate with 38.129.56.17 port 48228: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 23 09:16:37 np0005593295.novalocal sshd-session[26606]: Connection closed by 38.129.56.17 port 48192 [preauth]
Jan 23 09:16:37 np0005593295.novalocal sshd-session[26607]: Connection closed by 38.129.56.17 port 48194 [preauth]
Jan 23 09:16:37 np0005593295.novalocal sshd-session[26604]: Unable to negotiate with 38.129.56.17 port 48222: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 23 09:16:41 np0005593295.novalocal sshd-session[28140]: Accepted publickey for zuul from 38.102.83.114 port 44542 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:16:41 np0005593295.novalocal systemd-logind[786]: New session 7 of user zuul.
Jan 23 09:16:41 np0005593295.novalocal systemd[1]: Started Session 7 of User zuul.
Jan 23 09:16:41 np0005593295.novalocal sshd-session[28140]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:16:41 np0005593295.novalocal python3[28237]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:16:42 np0005593295.novalocal sudo[28461]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnxlbzdhxvnteswmexaximkuxfmcoozt ; /usr/bin/python3'
Jan 23 09:16:42 np0005593295.novalocal sudo[28461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:42 np0005593295.novalocal python3[28471]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:16:42 np0005593295.novalocal sudo[28461]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:43 np0005593295.novalocal sudo[28885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvadxmkkoksodprufcxeuhrvfpsdcevk ; /usr/bin/python3'
Jan 23 09:16:43 np0005593295.novalocal sudo[28885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:43 np0005593295.novalocal python3[28899]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005593295.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 23 09:16:43 np0005593295.novalocal useradd[28968]: new group: name=cloud-admin, GID=1002
Jan 23 09:16:43 np0005593295.novalocal useradd[28968]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 23 09:16:43 np0005593295.novalocal sudo[28885]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:44 np0005593295.novalocal sudo[29490]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvymifgxdoeemdishyehiisuyewgxfrl ; /usr/bin/python3'
Jan 23 09:16:44 np0005593295.novalocal sudo[29490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:44 np0005593295.novalocal python3[29500]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:16:44 np0005593295.novalocal sudo[29490]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:44 np0005593295.novalocal sudo[29773]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfmlfvnrvguiccyvlzctfmbhrhflbfdx ; /usr/bin/python3'
Jan 23 09:16:44 np0005593295.novalocal sudo[29773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:44 np0005593295.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:16:44 np0005593295.novalocal systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:16:44 np0005593295.novalocal systemd[1]: man-db-cache-update.service: Consumed 51.162s CPU time.
Jan 23 09:16:44 np0005593295.novalocal systemd[1]: run-r160dbfaf61024b97a14b61df3240348e.service: Deactivated successfully.
Jan 23 09:16:45 np0005593295.novalocal python3[29782]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:16:45 np0005593295.novalocal sudo[29773]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:45 np0005593295.novalocal sudo[29854]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhvfcfrlaqnkyinlaosowwjfcpkjtlgl ; /usr/bin/python3'
Jan 23 09:16:45 np0005593295.novalocal sudo[29854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:45 np0005593295.novalocal python3[29856]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159804.773685-153-263573844720942/source _original_basename=tmpjkqig1x_ follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:16:45 np0005593295.novalocal sudo[29854]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:46 np0005593295.novalocal sudo[29904]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzrlfbvyvfofpeodjzxsjemlznazyarz ; /usr/bin/python3'
Jan 23 09:16:46 np0005593295.novalocal sudo[29904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:46 np0005593295.novalocal python3[29906]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Jan 23 09:16:46 np0005593295.novalocal systemd[1]: Starting Hostname Service...
Jan 23 09:16:46 np0005593295.novalocal systemd[1]: Started Hostname Service.
Jan 23 09:16:46 np0005593295.novalocal systemd-hostnamed[29910]: Changed pretty hostname to 'compute-2'
Jan 23 09:16:46 compute-2 systemd-hostnamed[29910]: Hostname set to <compute-2> (static)
Jan 23 09:16:46 compute-2 NetworkManager[7219]: <info>  [1769159806.5835] hostname: static hostname changed from "np0005593295.novalocal" to "compute-2"
Jan 23 09:16:46 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:16:46 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:16:46 compute-2 sudo[29904]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:47 compute-2 sshd-session[28184]: Connection closed by 38.102.83.114 port 44542
Jan 23 09:16:47 compute-2 sshd-session[28140]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:16:47 compute-2 systemd[1]: session-7.scope: Deactivated successfully.
Jan 23 09:16:47 compute-2 systemd-logind[786]: Session 7 logged out. Waiting for processes to exit.
Jan 23 09:16:47 compute-2 systemd[1]: session-7.scope: Consumed 2.141s CPU time.
Jan 23 09:16:47 compute-2 systemd-logind[786]: Removed session 7.
Jan 23 09:16:56 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:17:16 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 09:21:46 compute-2 sshd-session[29932]: Accepted publickey for zuul from 38.129.56.17 port 54946 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:21:46 compute-2 systemd-logind[786]: New session 8 of user zuul.
Jan 23 09:21:46 compute-2 systemd[1]: Started Session 8 of User zuul.
Jan 23 09:21:46 compute-2 sshd-session[29932]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:21:46 compute-2 python3[30008]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:21:48 compute-2 sudo[30122]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luvjliarusldpycblecuqfamlrvbxdqu ; /usr/bin/python3'
Jan 23 09:21:48 compute-2 sudo[30122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:48 compute-2 python3[30124]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:48 compute-2 sudo[30122]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:49 compute-2 sudo[30195]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eueonepnletvyxzghxclazfaulyajyhf ; /usr/bin/python3'
Jan 23 09:21:49 compute-2 sudo[30195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:49 compute-2 python3[30197]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:49 compute-2 sudo[30195]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:49 compute-2 sudo[30221]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdufpyupbgcmsgaimdcduhzgysxmljlo ; /usr/bin/python3'
Jan 23 09:21:49 compute-2 sudo[30221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:49 compute-2 python3[30223]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:49 compute-2 sudo[30221]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:49 compute-2 sudo[30294]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuioeyknyryyikpbvwqlelpfpajypeis ; /usr/bin/python3'
Jan 23 09:21:49 compute-2 sudo[30294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:50 compute-2 python3[30296]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:50 compute-2 sudo[30294]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:50 compute-2 sudo[30320]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goygxaagxizugdyygqzghnolmpmftrmi ; /usr/bin/python3'
Jan 23 09:21:50 compute-2 sudo[30320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:50 compute-2 python3[30322]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:50 compute-2 sudo[30320]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:50 compute-2 sudo[30393]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdwosdghwbgjenkotuzbtctkxeqearec ; /usr/bin/python3'
Jan 23 09:21:50 compute-2 sudo[30393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:50 compute-2 python3[30395]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:50 compute-2 sudo[30393]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:50 compute-2 sudo[30419]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idxdtzrvnqvnfgspsffwpumefrgnrpmp ; /usr/bin/python3'
Jan 23 09:21:50 compute-2 sudo[30419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:50 compute-2 python3[30421]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:50 compute-2 sudo[30419]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:51 compute-2 sudo[30492]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjacgsvrmgzfuwhbziqpfxfxiwirwutx ; /usr/bin/python3'
Jan 23 09:21:51 compute-2 sudo[30492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:51 compute-2 python3[30494]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:51 compute-2 sudo[30492]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:51 compute-2 sudo[30518]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiqsddkcakkcneatnmqgalewsvrmyoax ; /usr/bin/python3'
Jan 23 09:21:51 compute-2 sudo[30518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:51 compute-2 python3[30520]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:51 compute-2 sudo[30518]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:51 compute-2 sudo[30591]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmbinaovcscxsyoesfxcspsyxgoodicr ; /usr/bin/python3'
Jan 23 09:21:51 compute-2 sudo[30591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:51 compute-2 python3[30593]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:51 compute-2 sudo[30591]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:51 compute-2 sudo[30617]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pueuhdspumvcirggzyuuooacmikbjsgr ; /usr/bin/python3'
Jan 23 09:21:51 compute-2 sudo[30617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:52 compute-2 python3[30619]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:52 compute-2 sudo[30617]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:52 compute-2 sudo[30690]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekxwdjmzwqgrwwgzsfljlgvzkdbffwke ; /usr/bin/python3'
Jan 23 09:21:52 compute-2 sudo[30690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:52 compute-2 python3[30692]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:52 compute-2 sudo[30690]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:52 compute-2 sudo[30716]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygrrxnwllohnyiwsztyyshmerfzgvsra ; /usr/bin/python3'
Jan 23 09:21:52 compute-2 sudo[30716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:52 compute-2 python3[30718]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:52 compute-2 sudo[30716]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:52 compute-2 sudo[30789]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlplnpqbgeokfbqfjiszhsuiqrznuolt ; /usr/bin/python3'
Jan 23 09:21:52 compute-2 sudo[30789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:52 compute-2 python3[30791]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:52 compute-2 sudo[30789]: pam_unix(sudo:session): session closed for user root
Jan 23 09:22:05 compute-2 python3[30839]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:27:05 compute-2 sshd-session[29935]: Received disconnect from 38.129.56.17 port 54946:11: disconnected by user
Jan 23 09:27:05 compute-2 sshd-session[29935]: Disconnected from user zuul 38.129.56.17 port 54946
Jan 23 09:27:05 compute-2 sshd-session[29932]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:27:05 compute-2 systemd[1]: session-8.scope: Deactivated successfully.
Jan 23 09:27:05 compute-2 systemd[1]: session-8.scope: Consumed 4.494s CPU time.
Jan 23 09:27:05 compute-2 systemd-logind[786]: Session 8 logged out. Waiting for processes to exit.
Jan 23 09:27:05 compute-2 systemd-logind[786]: Removed session 8.
Jan 23 09:33:01 compute-2 anacron[2783]: Job `cron.daily' started
Jan 23 09:33:01 compute-2 anacron[2783]: Job `cron.daily' terminated
Jan 23 09:33:46 compute-2 systemd[1]: Starting dnf makecache...
Jan 23 09:33:46 compute-2 dnf[30850]: Failed determining last makecache time.
Jan 23 09:33:46 compute-2 dnf[30850]: delorean-openstack-barbican-42b4c41831408a8e323 316 kB/s |  13 kB     00:00
Jan 23 09:33:46 compute-2 dnf[30850]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.5 MB/s |  65 kB     00:00
Jan 23 09:33:46 compute-2 dnf[30850]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.3 MB/s |  32 kB     00:00
Jan 23 09:33:47 compute-2 dnf[30850]: delorean-python-stevedore-c4acc5639fd2329372142 245 kB/s | 131 kB     00:00
Jan 23 09:33:47 compute-2 dnf[30850]: delorean-python-cloudkitty-tests-tempest-2c80f8 293 kB/s |  32 kB     00:00
Jan 23 09:33:47 compute-2 dnf[30850]: delorean-os-refresh-config-9bfc52b5049be2d8de61 7.6 MB/s | 349 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 214 kB/s |  42 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-python-designate-tests-tempest-347fdbc 620 kB/s |  18 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-openstack-glance-1fd12c29b339f30fe823e 600 kB/s |  18 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 891 kB/s |  29 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-openstack-manila-3c01b7181572c95dac462 918 kB/s |  25 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-python-whitebox-neutron-tests-tempest- 5.5 MB/s | 154 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-openstack-octavia-ba397f07a7331190208c 221 kB/s |  26 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-openstack-watcher-c014f81a8647287f6dcc 399 kB/s |  16 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-ansible-config_template-5ccaa22121a7ff 218 kB/s | 7.4 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 2.5 MB/s | 144 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-openstack-swift-dc98a8463506ac520c469a 301 kB/s |  14 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-python-tempestconf-8515371b7cceebd4282 758 kB/s |  53 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: delorean-openstack-heat-ui-013accbfd179753bc3f0 2.6 MB/s |  96 kB     00:00
Jan 23 09:33:48 compute-2 dnf[30850]: CentOS Stream 9 - BaseOS                         67 kB/s | 6.7 kB     00:00
Jan 23 09:33:49 compute-2 dnf[30850]: CentOS Stream 9 - AppStream                      65 kB/s | 6.8 kB     00:00
Jan 23 09:33:49 compute-2 dnf[30850]: CentOS Stream 9 - CRB                            58 kB/s | 6.6 kB     00:00
Jan 23 09:33:49 compute-2 dnf[30850]: CentOS Stream 9 - Extras packages                69 kB/s | 7.3 kB     00:00
Jan 23 09:33:49 compute-2 dnf[30850]: dlrn-antelope-testing                           7.0 MB/s | 1.1 MB     00:00
Jan 23 09:33:49 compute-2 dnf[30850]: dlrn-antelope-build-deps                         14 MB/s | 461 kB     00:00
Jan 23 09:33:50 compute-2 dnf[30850]: centos9-rabbitmq                                7.9 MB/s | 123 kB     00:00
Jan 23 09:33:50 compute-2 dnf[30850]: centos9-storage                                  19 MB/s | 415 kB     00:00
Jan 23 09:33:50 compute-2 dnf[30850]: centos9-opstools                                3.9 MB/s |  51 kB     00:00
Jan 23 09:33:50 compute-2 dnf[30850]: NFV SIG OpenvSwitch                              22 MB/s | 461 kB     00:00
Jan 23 09:33:52 compute-2 dnf[30850]: repo-setup-centos-appstream                      18 MB/s |  26 MB     00:01
Jan 23 09:33:58 compute-2 dnf[30850]: repo-setup-centos-baseos                         30 MB/s | 8.9 MB     00:00
Jan 23 09:34:00 compute-2 dnf[30850]: repo-setup-centos-highavailability               22 MB/s | 744 kB     00:00
Jan 23 09:34:00 compute-2 dnf[30850]: repo-setup-centos-powertools                     37 MB/s | 7.6 MB     00:00
Jan 23 09:34:03 compute-2 dnf[30850]: Extra Packages for Enterprise Linux 9 - x86_64   18 MB/s |  20 MB     00:01
Jan 23 09:34:21 compute-2 dnf[30850]: Metadata cache created.
Jan 23 09:34:21 compute-2 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 09:34:21 compute-2 systemd[1]: Finished dnf makecache.
Jan 23 09:34:21 compute-2 systemd[1]: dnf-makecache.service: Consumed 29.511s CPU time.
Jan 23 09:37:14 compute-2 sshd-session[30953]: Accepted publickey for zuul from 192.168.122.30 port 57938 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:37:14 compute-2 systemd-logind[786]: New session 9 of user zuul.
Jan 23 09:37:14 compute-2 systemd[1]: Started Session 9 of User zuul.
Jan 23 09:37:14 compute-2 sshd-session[30953]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:37:15 compute-2 python3.9[31106]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:37:16 compute-2 sudo[31285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlnuvklftybjuedizlzcpmdsnjvzoesx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161035.7405126-54-278061806575938/AnsiballZ_command.py'
Jan 23 09:37:16 compute-2 sudo[31285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:16 compute-2 python3.9[31287]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:37:24 compute-2 sudo[31285]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:29 compute-2 sshd-session[30956]: Connection closed by 192.168.122.30 port 57938
Jan 23 09:37:29 compute-2 sshd-session[30953]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:37:29 compute-2 systemd[1]: session-9.scope: Deactivated successfully.
Jan 23 09:37:29 compute-2 systemd[1]: session-9.scope: Consumed 8.364s CPU time.
Jan 23 09:37:29 compute-2 systemd-logind[786]: Session 9 logged out. Waiting for processes to exit.
Jan 23 09:37:29 compute-2 systemd-logind[786]: Removed session 9.
Jan 23 09:37:46 compute-2 sshd-session[31344]: Accepted publickey for zuul from 192.168.122.30 port 42762 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:37:46 compute-2 systemd-logind[786]: New session 10 of user zuul.
Jan 23 09:37:46 compute-2 systemd[1]: Started Session 10 of User zuul.
Jan 23 09:37:46 compute-2 sshd-session[31344]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:37:47 compute-2 python3.9[31497]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 09:37:48 compute-2 python3.9[31671]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:37:49 compute-2 sudo[31821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-limvugvhyaccxawfaotiwotfbrzlogqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161069.3298454-90-145941444694443/AnsiballZ_command.py'
Jan 23 09:37:49 compute-2 sudo[31821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:49 compute-2 python3.9[31823]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:37:49 compute-2 sudo[31821]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:51 compute-2 sudo[31974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwrkafkampyyytbzxvaqpjglruwngojh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161070.4223514-127-240491923566516/AnsiballZ_stat.py'
Jan 23 09:37:51 compute-2 sudo[31974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:51 compute-2 python3.9[31976]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:37:51 compute-2 sudo[31974]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:51 compute-2 sudo[32126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxakdrlujyactdxbmowjadskudtmfxeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161071.5684416-151-188950431856262/AnsiballZ_file.py'
Jan 23 09:37:51 compute-2 sudo[32126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:52 compute-2 python3.9[32128]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:37:52 compute-2 sudo[32126]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:52 compute-2 sudo[32278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iscsdehuljwiwuvfrknhzpjzdeekgljm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161072.428582-175-40845375630410/AnsiballZ_stat.py'
Jan 23 09:37:52 compute-2 sudo[32278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:52 compute-2 python3.9[32280]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:37:52 compute-2 sudo[32278]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:53 compute-2 sudo[32401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emeotsenpmmvsqbxfujpypeivwtiqoac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161072.428582-175-40845375630410/AnsiballZ_copy.py'
Jan 23 09:37:53 compute-2 sudo[32401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:53 compute-2 python3.9[32403]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161072.428582-175-40845375630410/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:37:53 compute-2 sudo[32401]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:54 compute-2 sudo[32553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dydeztxjgfruilcicaoorfissypkbckk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161074.1562943-220-175389263623256/AnsiballZ_setup.py'
Jan 23 09:37:54 compute-2 sudo[32553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:54 compute-2 python3.9[32555]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:37:54 compute-2 sudo[32553]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:55 compute-2 sudo[32709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvzspyhhihswibpvhuoypkmjjdaoxnur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161075.1562097-243-139341816725344/AnsiballZ_file.py'
Jan 23 09:37:55 compute-2 sudo[32709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:55 compute-2 python3.9[32711]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:37:55 compute-2 sudo[32709]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:56 compute-2 sudo[32861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnkddgogztfefjdnfrieinyqcpdjclxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161075.9155579-270-10825109842044/AnsiballZ_file.py'
Jan 23 09:37:56 compute-2 sudo[32861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:56 compute-2 python3.9[32863]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:37:56 compute-2 sudo[32861]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:57 compute-2 python3.9[33013]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:38:00 compute-2 python3.9[33266]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:38:01 compute-2 python3.9[33416]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:38:03 compute-2 python3.9[33570]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:38:04 compute-2 sudo[33726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkeajnujcmffyksytowulxtfwjoebbsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161083.7530637-415-223540983504397/AnsiballZ_setup.py'
Jan 23 09:38:04 compute-2 sudo[33726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:38:04 compute-2 python3.9[33728]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:38:04 compute-2 sudo[33726]: pam_unix(sudo:session): session closed for user root
Jan 23 09:38:05 compute-2 sudo[33810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfumtdiuszwkgrmarocqmjljyzxkpdht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161083.7530637-415-223540983504397/AnsiballZ_dnf.py'
Jan 23 09:38:05 compute-2 sudo[33810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:38:05 compute-2 python3.9[33812]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:38:57 compute-2 systemd[1]: Reloading.
Jan 23 09:38:57 compute-2 systemd-rc-local-generator[34012]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:38:57 compute-2 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 23 09:38:58 compute-2 systemd[1]: Reloading.
Jan 23 09:38:58 compute-2 systemd-rc-local-generator[34057]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:38:58 compute-2 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 23 09:38:58 compute-2 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 23 09:38:58 compute-2 systemd[1]: Reloading.
Jan 23 09:38:58 compute-2 systemd-rc-local-generator[34097]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:38:58 compute-2 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 23 09:38:58 compute-2 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 09:38:58 compute-2 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 09:38:58 compute-2 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 09:40:18 compute-2 kernel: SELinux:  Converting 2725 SID table entries...
Jan 23 09:40:18 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:40:18 compute-2 kernel: SELinux:  policy capability open_perms=1
Jan 23 09:40:18 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:40:18 compute-2 kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:40:18 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:40:18 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:40:18 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:40:18 compute-2 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 23 09:40:18 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:40:18 compute-2 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:40:18 compute-2 systemd[1]: Reloading.
Jan 23 09:40:18 compute-2 systemd-rc-local-generator[34420]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:40:18 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:40:19 compute-2 sudo[33810]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:20 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:40:20 compute-2 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:40:20 compute-2 systemd[1]: man-db-cache-update.service: Consumed 1.281s CPU time.
Jan 23 09:40:20 compute-2 systemd[1]: run-rd2a9490f43a342dd977239ca781a8fd0.service: Deactivated successfully.
Jan 23 09:40:31 compute-2 sudo[35332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqlbjavvtrxeliufchblzbeezfsivltt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161231.70204-452-38444830911590/AnsiballZ_command.py'
Jan 23 09:40:31 compute-2 sudo[35332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:32 compute-2 python3.9[35334]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:40:33 compute-2 sudo[35332]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:34 compute-2 sudo[35613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osfhjebkmahrtdtmktvikepvdvumfeim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161233.4898353-475-192743241329955/AnsiballZ_selinux.py'
Jan 23 09:40:34 compute-2 sudo[35613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:34 compute-2 python3.9[35615]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 09:40:34 compute-2 sudo[35613]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:35 compute-2 sudo[35765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swwuedbgvpsgxdidxygifwcdnzvmwvgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161235.3001368-507-183623840545291/AnsiballZ_command.py'
Jan 23 09:40:35 compute-2 sudo[35765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:35 compute-2 python3.9[35767]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 09:40:38 compute-2 sudo[35765]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:39 compute-2 sudo[35918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnwwljxqawtaywdjixbmmwhhwjqzimol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161239.5834932-532-115739653353382/AnsiballZ_file.py'
Jan 23 09:40:39 compute-2 sudo[35918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:40 compute-2 python3.9[35920]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:40:40 compute-2 sudo[35918]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:41 compute-2 sudo[36070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wppoyvocwrkxlshmoodypwkamxkdtdaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161241.0137575-556-1373063740452/AnsiballZ_mount.py'
Jan 23 09:40:41 compute-2 sudo[36070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:44 compute-2 python3.9[36073]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 09:40:44 compute-2 sudo[36070]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:45 compute-2 sudo[36223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmxtsaujsueteehxkhqdyrrwbnopfixn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161245.4583092-640-226143535725346/AnsiballZ_file.py'
Jan 23 09:40:45 compute-2 sudo[36223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:48 compute-2 python3.9[36225]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:40:48 compute-2 sudo[36223]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:49 compute-2 sudo[36375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jocxeohbgddlhwrtdglrazrrtuptywvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161248.79358-665-99883130303473/AnsiballZ_stat.py'
Jan 23 09:40:49 compute-2 sudo[36375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:49 compute-2 python3.9[36377]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:40:49 compute-2 sudo[36375]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:49 compute-2 sudo[36498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urbwzefxsiiuuoasgascqdyibgectzdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161248.79358-665-99883130303473/AnsiballZ_copy.py'
Jan 23 09:40:49 compute-2 sudo[36498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:49 compute-2 python3.9[36500]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161248.79358-665-99883130303473/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:40:49 compute-2 sudo[36498]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:56 compute-2 sudo[36650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldvotdjnzaokvmcnnnckoewuhgcnabad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161255.80237-736-31033391998718/AnsiballZ_stat.py'
Jan 23 09:40:56 compute-2 sudo[36650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:56 compute-2 python3.9[36652]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:40:56 compute-2 sudo[36650]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:57 compute-2 sudo[36802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezvrqsndficnfnqnhsygkgnxbyaeqfop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161256.4905078-760-250684465251154/AnsiballZ_command.py'
Jan 23 09:40:57 compute-2 sudo[36802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:57 compute-2 python3.9[36804]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:40:57 compute-2 sudo[36802]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:57 compute-2 sudo[36955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adujpafkywsfgbtuwjaeysbxbwsywbgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161257.507182-783-75291282741024/AnsiballZ_file.py'
Jan 23 09:40:57 compute-2 sudo[36955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:57 compute-2 python3.9[36957]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:40:57 compute-2 sudo[36955]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:58 compute-2 sudo[37107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqawghjmtyqrxvfbzmnshlbzgrxhxdix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161258.421959-817-116007897435719/AnsiballZ_getent.py'
Jan 23 09:40:58 compute-2 sudo[37107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:59 compute-2 python3.9[37109]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 09:40:59 compute-2 sudo[37107]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:59 compute-2 sudo[37260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkxvcdlequynzbbpnrbcjlaeszjivvzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161259.3715398-841-113246341043345/AnsiballZ_group.py'
Jan 23 09:40:59 compute-2 sudo[37260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:00 compute-2 python3.9[37262]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:41:00 compute-2 groupadd[37263]: group added to /etc/group: name=qemu, GID=107
Jan 23 09:41:00 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:41:00 compute-2 groupadd[37263]: group added to /etc/gshadow: name=qemu
Jan 23 09:41:00 compute-2 groupadd[37263]: new group: name=qemu, GID=107
Jan 23 09:41:00 compute-2 sudo[37260]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:01 compute-2 sudo[37419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzeupolkyqzwmteqjqwiznbrykghhddu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161260.6486037-864-194333139866470/AnsiballZ_user.py'
Jan 23 09:41:01 compute-2 sudo[37419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:01 compute-2 python3.9[37421]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 09:41:01 compute-2 useradd[37423]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 09:41:01 compute-2 sudo[37419]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:02 compute-2 sudo[37579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peqofccleuksczdjejrrpcgmlrvyraeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161262.2200549-889-114856617012726/AnsiballZ_getent.py'
Jan 23 09:41:02 compute-2 sudo[37579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:02 compute-2 python3.9[37581]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 09:41:02 compute-2 sudo[37579]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:03 compute-2 sudo[37732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqrknwnfrijtynewvdawnbypxejatdvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161263.0153-913-254101687845106/AnsiballZ_group.py'
Jan 23 09:41:03 compute-2 sudo[37732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:03 compute-2 python3.9[37734]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:41:03 compute-2 groupadd[37735]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 23 09:41:03 compute-2 groupadd[37735]: group added to /etc/gshadow: name=hugetlbfs
Jan 23 09:41:03 compute-2 groupadd[37735]: new group: name=hugetlbfs, GID=42477
Jan 23 09:41:03 compute-2 sudo[37732]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:04 compute-2 sudo[37890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjrgefqgaaeaedetldqkiopcpraiksak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161264.076122-940-242182174843521/AnsiballZ_file.py'
Jan 23 09:41:04 compute-2 sudo[37890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:04 compute-2 python3.9[37892]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 09:41:04 compute-2 sudo[37890]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:05 compute-2 sudo[38042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqtttumgkqlfxmsrqzgtlmodfgdrsmjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161265.2069106-972-141788339620649/AnsiballZ_dnf.py'
Jan 23 09:41:05 compute-2 sudo[38042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:06 compute-2 python3.9[38044]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:41:09 compute-2 sudo[38042]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:11 compute-2 sudo[38195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogcivhctdlsmmrmcbfnworgutkxnpzia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161271.0612328-997-132189149445903/AnsiballZ_file.py'
Jan 23 09:41:11 compute-2 sudo[38195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:11 compute-2 python3.9[38197]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:41:11 compute-2 sudo[38195]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:12 compute-2 sudo[38347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imeuwcjesafzfsnbieqcgyckbwrzidrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161271.7453775-1021-210806067985898/AnsiballZ_stat.py'
Jan 23 09:41:12 compute-2 sudo[38347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:12 compute-2 python3.9[38349]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:41:12 compute-2 sudo[38347]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:12 compute-2 sudo[38470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvtbqlikoxcpvtqsnmprinxrqxoujsmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161271.7453775-1021-210806067985898/AnsiballZ_copy.py'
Jan 23 09:41:12 compute-2 sudo[38470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:12 compute-2 python3.9[38472]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161271.7453775-1021-210806067985898/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:41:12 compute-2 sudo[38470]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:13 compute-2 sudo[38622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgozkadsckyxlavoirryhsqbizvbinfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161273.0326464-1066-280830217368509/AnsiballZ_systemd.py'
Jan 23 09:41:13 compute-2 sudo[38622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:13 compute-2 python3.9[38624]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:41:13 compute-2 systemd[1]: Starting Load Kernel Modules...
Jan 23 09:41:13 compute-2 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 23 09:41:13 compute-2 kernel: Bridge firewalling registered
Jan 23 09:41:13 compute-2 systemd-modules-load[38628]: Inserted module 'br_netfilter'
Jan 23 09:41:13 compute-2 systemd[1]: Finished Load Kernel Modules.
Jan 23 09:41:14 compute-2 sudo[38622]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:14 compute-2 sudo[38781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jevpsnfampwpxgzeobmgxhsysagrbleg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161274.2259042-1090-119051593418181/AnsiballZ_stat.py'
Jan 23 09:41:14 compute-2 sudo[38781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:14 compute-2 python3.9[38783]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:41:14 compute-2 sudo[38781]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:15 compute-2 sudo[38904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrirpasgluckfgnyepiejwigwzynddqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161274.2259042-1090-119051593418181/AnsiballZ_copy.py'
Jan 23 09:41:15 compute-2 sudo[38904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:15 compute-2 python3.9[38906]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161274.2259042-1090-119051593418181/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:41:15 compute-2 sudo[38904]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:16 compute-2 sudo[39056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhqcrlnruqfokcsbneooexugeqaegpmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161275.7798123-1143-187251827967797/AnsiballZ_dnf.py'
Jan 23 09:41:16 compute-2 sudo[39056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:16 compute-2 python3.9[39058]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:41:20 compute-2 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 09:41:20 compute-2 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 09:41:20 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:41:20 compute-2 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:41:20 compute-2 systemd[1]: Reloading.
Jan 23 09:41:21 compute-2 systemd-rc-local-generator[39124]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:41:21 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:41:21 compute-2 sudo[39056]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:23 compute-2 python3.9[40816]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:41:23 compute-2 python3.9[41892]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 09:41:24 compute-2 python3.9[42736]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:41:25 compute-2 sudo[43138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wldifomplyricpmdbuokgltlcxszizzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161285.0479877-1261-99608663731219/AnsiballZ_command.py'
Jan 23 09:41:25 compute-2 sudo[43138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:25 compute-2 python3.9[43140]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:25 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 09:41:26 compute-2 systemd[1]: Starting Authorization Manager...
Jan 23 09:41:26 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 09:41:26 compute-2 polkitd[43445]: Started polkitd version 0.117
Jan 23 09:41:26 compute-2 polkitd[43445]: Loading rules from directory /etc/polkit-1/rules.d
Jan 23 09:41:26 compute-2 polkitd[43445]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 23 09:41:26 compute-2 polkitd[43445]: Finished loading, compiling and executing 2 rules
Jan 23 09:41:26 compute-2 polkitd[43445]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 23 09:41:26 compute-2 systemd[1]: Started Authorization Manager.
Jan 23 09:41:26 compute-2 sudo[43138]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:27 compute-2 sudo[43613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zshzsdlkwylxmafwizhqpdczgnlnfwku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161287.267287-1288-185719131921024/AnsiballZ_systemd.py'
Jan 23 09:41:27 compute-2 sudo[43613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:27 compute-2 python3.9[43615]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:41:27 compute-2 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 09:41:28 compute-2 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 09:41:28 compute-2 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 09:41:28 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 09:41:28 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 09:41:28 compute-2 sudo[43613]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:28 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:41:28 compute-2 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:41:28 compute-2 systemd[1]: man-db-cache-update.service: Consumed 4.477s CPU time.
Jan 23 09:41:28 compute-2 systemd[1]: run-r1b1611b0f9d34e309c0d721a124079b8.service: Deactivated successfully.
Jan 23 09:41:28 compute-2 python3.9[43778]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 09:41:32 compute-2 sudo[43928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnxqmenpdorfukbzycjxxsmqovxtdtze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161292.1618316-1459-279828046899324/AnsiballZ_systemd.py'
Jan 23 09:41:32 compute-2 sudo[43928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:32 compute-2 python3.9[43930]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:41:32 compute-2 systemd[1]: Reloading.
Jan 23 09:41:32 compute-2 systemd-rc-local-generator[43959]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:41:33 compute-2 sudo[43928]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:33 compute-2 sudo[44118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyzmhubtkyiawnxifrokhcpzdrnhlmen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161293.1585329-1459-38378634435164/AnsiballZ_systemd.py'
Jan 23 09:41:33 compute-2 sudo[44118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:33 compute-2 python3.9[44120]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:41:33 compute-2 systemd[1]: Reloading.
Jan 23 09:41:33 compute-2 systemd-rc-local-generator[44148]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:41:33 compute-2 sudo[44118]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:34 compute-2 sudo[44307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gymwcfvlknyffrkzkbndllfcxorddpkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161294.4614744-1507-17557869253691/AnsiballZ_command.py'
Jan 23 09:41:34 compute-2 sudo[44307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:34 compute-2 python3.9[44309]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:34 compute-2 sudo[44307]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:35 compute-2 sudo[44461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyyzbohhburzwcksxlrhzaqqktkkiycx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161295.3893282-1531-91862785933803/AnsiballZ_command.py'
Jan 23 09:41:35 compute-2 sudo[44461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:35 compute-2 python3.9[44463]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:35 compute-2 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 23 09:41:35 compute-2 sudo[44461]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:36 compute-2 sudo[44614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhqmiykyjtrnrdurwrhnhwzxdfsaetis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161296.1776345-1555-257843594721765/AnsiballZ_command.py'
Jan 23 09:41:36 compute-2 sudo[44614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:36 compute-2 python3.9[44616]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:37 compute-2 sshd-session[44358]: Connection closed by 185.247.137.36 port 56667
Jan 23 09:41:37 compute-2 sshd-session[44622]: Connection closed by 185.247.137.36 port 38229 [preauth]
Jan 23 09:41:38 compute-2 sudo[44614]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:38 compute-2 sudo[44778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peuursifaqqxpbsdqwvvkkcrsrxzqquj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161298.3836126-1579-177731626122737/AnsiballZ_command.py'
Jan 23 09:41:38 compute-2 sudo[44778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:38 compute-2 python3.9[44780]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:38 compute-2 sudo[44778]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:39 compute-2 sudo[44931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvhnppoyuzrjfhjuousmqunaiaxftegt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161299.1862707-1603-96473383642716/AnsiballZ_systemd.py'
Jan 23 09:41:39 compute-2 sudo[44931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:39 compute-2 python3.9[44933]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:41:39 compute-2 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 09:41:39 compute-2 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 09:41:39 compute-2 systemd[1]: Stopping Apply Kernel Variables...
Jan 23 09:41:39 compute-2 systemd[1]: Starting Apply Kernel Variables...
Jan 23 09:41:39 compute-2 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 09:41:39 compute-2 systemd[1]: Finished Apply Kernel Variables.
Jan 23 09:41:39 compute-2 sudo[44931]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:40 compute-2 sshd-session[31347]: Connection closed by 192.168.122.30 port 42762
Jan 23 09:41:40 compute-2 sshd-session[31344]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:41:40 compute-2 systemd[1]: session-10.scope: Deactivated successfully.
Jan 23 09:41:40 compute-2 systemd[1]: session-10.scope: Consumed 2min 33.798s CPU time.
Jan 23 09:41:40 compute-2 systemd-logind[786]: Session 10 logged out. Waiting for processes to exit.
Jan 23 09:41:40 compute-2 systemd-logind[786]: Removed session 10.
Jan 23 09:41:47 compute-2 sshd-session[44964]: Accepted publickey for zuul from 192.168.122.30 port 45902 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:41:47 compute-2 systemd-logind[786]: New session 11 of user zuul.
Jan 23 09:41:47 compute-2 systemd[1]: Started Session 11 of User zuul.
Jan 23 09:41:47 compute-2 sshd-session[44964]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:41:48 compute-2 python3.9[45117]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:41:49 compute-2 sudo[45271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnptgupyxmosialwpcqkmhjadlupvxia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161309.203307-65-231060974320885/AnsiballZ_getent.py'
Jan 23 09:41:49 compute-2 sudo[45271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:50 compute-2 python3.9[45273]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 09:41:50 compute-2 sudo[45271]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:50 compute-2 sudo[45424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxlkcougsvilxxdxufvpxarvtdoagwba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161310.4782412-89-264181862375334/AnsiballZ_group.py'
Jan 23 09:41:50 compute-2 sudo[45424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:51 compute-2 python3.9[45426]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:41:51 compute-2 groupadd[45427]: group added to /etc/group: name=openvswitch, GID=42476
Jan 23 09:41:51 compute-2 groupadd[45427]: group added to /etc/gshadow: name=openvswitch
Jan 23 09:41:51 compute-2 groupadd[45427]: new group: name=openvswitch, GID=42476
Jan 23 09:41:51 compute-2 sudo[45424]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:52 compute-2 sudo[45582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jakmituxseydyvlcbwfizqqskndxvwym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161311.8866878-113-49789832735969/AnsiballZ_user.py'
Jan 23 09:41:52 compute-2 sudo[45582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:52 compute-2 python3.9[45584]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 09:41:52 compute-2 useradd[45586]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 09:41:52 compute-2 useradd[45586]: add 'openvswitch' to group 'hugetlbfs'
Jan 23 09:41:52 compute-2 useradd[45586]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 23 09:41:52 compute-2 sudo[45582]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:55 compute-2 sudo[45742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mztgkgroponpztgcziklqkontcjojpap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161315.4784236-143-136410379671518/AnsiballZ_setup.py'
Jan 23 09:41:55 compute-2 sudo[45742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:56 compute-2 python3.9[45744]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:41:56 compute-2 sudo[45742]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:56 compute-2 sudo[45826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpwcgkepmfnlfuwmsywqhdelmqigfwxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161315.4784236-143-136410379671518/AnsiballZ_dnf.py'
Jan 23 09:41:56 compute-2 sudo[45826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:56 compute-2 python3.9[45828]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 09:41:59 compute-2 sudo[45826]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:00 compute-2 sudo[45989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdluwqjvkiqrfhjjmywhyanpkapzwski ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161320.191208-185-203212878247698/AnsiballZ_dnf.py'
Jan 23 09:42:00 compute-2 sudo[45989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:00 compute-2 python3.9[45991]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:42:16 compute-2 kernel: SELinux:  Converting 2737 SID table entries...
Jan 23 09:42:16 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:42:16 compute-2 kernel: SELinux:  policy capability open_perms=1
Jan 23 09:42:16 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:42:16 compute-2 kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:42:16 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:42:16 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:42:16 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:42:16 compute-2 groupadd[46015]: group added to /etc/group: name=unbound, GID=994
Jan 23 09:42:16 compute-2 groupadd[46015]: group added to /etc/gshadow: name=unbound
Jan 23 09:42:16 compute-2 groupadd[46015]: new group: name=unbound, GID=994
Jan 23 09:42:16 compute-2 useradd[46022]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 23 09:42:16 compute-2 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 23 09:42:16 compute-2 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 23 09:42:17 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:42:17 compute-2 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:42:17 compute-2 systemd[1]: Reloading.
Jan 23 09:42:18 compute-2 systemd-rc-local-generator[46518]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:42:18 compute-2 systemd-sysv-generator[46522]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:42:18 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:42:18 compute-2 sudo[45989]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:18 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:42:18 compute-2 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:42:18 compute-2 systemd[1]: run-r66549dc4f67642688b1013512a0a2329.service: Deactivated successfully.
Jan 23 09:42:24 compute-2 sudo[47088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybbiaykfdhruazellvszotweseftcwsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161343.9774604-210-32324717793755/AnsiballZ_systemd.py'
Jan 23 09:42:24 compute-2 sudo[47088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:24 compute-2 python3.9[47090]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:42:24 compute-2 systemd[1]: Reloading.
Jan 23 09:42:24 compute-2 systemd-sysv-generator[47123]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:42:24 compute-2 systemd-rc-local-generator[47120]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:42:25 compute-2 systemd[1]: Starting Open vSwitch Database Unit...
Jan 23 09:42:25 compute-2 chown[47131]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 23 09:42:25 compute-2 ovs-ctl[47136]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 23 09:42:25 compute-2 ovs-ctl[47136]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 23 09:42:25 compute-2 ovs-ctl[47136]: Starting ovsdb-server [  OK  ]
Jan 23 09:42:25 compute-2 ovs-vsctl[47185]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 23 09:42:25 compute-2 ovs-vsctl[47201]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"8fb585ea-168c-48ac-870f-617a4fa1bbde\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 23 09:42:25 compute-2 ovs-ctl[47136]: Configuring Open vSwitch system IDs [  OK  ]
Jan 23 09:42:25 compute-2 ovs-ctl[47136]: Enabling remote OVSDB managers [  OK  ]
Jan 23 09:42:25 compute-2 ovs-vsctl[47210]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 23 09:42:25 compute-2 systemd[1]: Started Open vSwitch Database Unit.
Jan 23 09:42:25 compute-2 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 23 09:42:25 compute-2 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 23 09:42:25 compute-2 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 23 09:42:25 compute-2 kernel: openvswitch: Open vSwitch switching datapath
Jan 23 09:42:25 compute-2 ovs-ctl[47255]: Inserting openvswitch module [  OK  ]
Jan 23 09:42:25 compute-2 ovs-ctl[47224]: Starting ovs-vswitchd [  OK  ]
Jan 23 09:42:25 compute-2 ovs-vsctl[47272]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 23 09:42:25 compute-2 ovs-ctl[47224]: Enabling remote OVSDB managers [  OK  ]
Jan 23 09:42:25 compute-2 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 23 09:42:25 compute-2 systemd[1]: Starting Open vSwitch...
Jan 23 09:42:25 compute-2 systemd[1]: Finished Open vSwitch.
Jan 23 09:42:25 compute-2 sudo[47088]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:27 compute-2 python3.9[47424]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:42:28 compute-2 sudo[47574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktvcrohefyyhxhbjjunlwajnuudugoaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161347.8420563-263-258673690176256/AnsiballZ_sefcontext.py'
Jan 23 09:42:28 compute-2 sudo[47574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:28 compute-2 python3.9[47576]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 09:42:29 compute-2 kernel: SELinux:  Converting 2751 SID table entries...
Jan 23 09:42:29 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:42:29 compute-2 kernel: SELinux:  policy capability open_perms=1
Jan 23 09:42:29 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:42:29 compute-2 kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:42:29 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:42:29 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:42:29 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:42:30 compute-2 sudo[47574]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:31 compute-2 python3.9[47731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:42:32 compute-2 sudo[47887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agdlqgezphhxtyalhpuowhmtkqtksiem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161352.2946272-317-175346688154374/AnsiballZ_dnf.py'
Jan 23 09:42:32 compute-2 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 23 09:42:32 compute-2 sudo[47887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:32 compute-2 python3.9[47889]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:42:34 compute-2 sudo[47887]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:35 compute-2 sudo[48040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eonxmxfnnvazomuifijclwuabvpjhedd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161354.8846679-342-87592359988325/AnsiballZ_command.py'
Jan 23 09:42:35 compute-2 sudo[48040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:35 compute-2 python3.9[48042]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:42:36 compute-2 sudo[48040]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:37 compute-2 sudo[48327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rehlelrlsmvkeptdutghwdotzkdnsafa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161356.651458-365-7097056149108/AnsiballZ_file.py'
Jan 23 09:42:37 compute-2 sudo[48327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:37 compute-2 python3.9[48329]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 09:42:37 compute-2 sudo[48327]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:38 compute-2 python3.9[48479]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:42:38 compute-2 sudo[48631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-belngjkzszrpdabgidvosuldztrtrfba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161358.4595554-413-166775105894749/AnsiballZ_dnf.py'
Jan 23 09:42:38 compute-2 sudo[48631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:38 compute-2 python3.9[48633]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:42:41 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:42:41 compute-2 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:42:41 compute-2 systemd[1]: Reloading.
Jan 23 09:42:41 compute-2 systemd-rc-local-generator[48672]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:42:41 compute-2 systemd-sysv-generator[48675]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:42:41 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:42:41 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:42:41 compute-2 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:42:41 compute-2 systemd[1]: run-rcdd47222d767459e9fc79b43eeb5d29d.service: Deactivated successfully.
Jan 23 09:42:41 compute-2 sudo[48631]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:44 compute-2 sudo[48949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifxdntszpkkmyzkmncybysyqifpumzor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161364.3200905-437-248279017309178/AnsiballZ_systemd.py'
Jan 23 09:42:44 compute-2 sudo[48949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:45 compute-2 python3.9[48951]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:42:45 compute-2 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 09:42:45 compute-2 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 09:42:45 compute-2 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 09:42:45 compute-2 NetworkManager[7219]: <info>  [1769161365.1061] caught SIGTERM, shutting down normally.
Jan 23 09:42:45 compute-2 NetworkManager[7219]: <info>  [1769161365.1072] dhcp4 (eth0): canceled DHCP transaction
Jan 23 09:42:45 compute-2 NetworkManager[7219]: <info>  [1769161365.1072] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:42:45 compute-2 NetworkManager[7219]: <info>  [1769161365.1072] dhcp4 (eth0): state changed no lease
Jan 23 09:42:45 compute-2 NetworkManager[7219]: <info>  [1769161365.1074] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 09:42:45 compute-2 systemd[1]: Stopping Network Manager...
Jan 23 09:42:45 compute-2 NetworkManager[7219]: <info>  [1769161365.1134] exiting (success)
Jan 23 09:42:45 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:42:45 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:42:45 compute-2 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 09:42:45 compute-2 systemd[1]: Stopped Network Manager.
Jan 23 09:42:45 compute-2 systemd[1]: NetworkManager.service: Consumed 13.287s CPU time, 4.1M memory peak, read 0B from disk, written 32.5K to disk.
Jan 23 09:42:45 compute-2 systemd[1]: Starting Network Manager...
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.1767] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:20df1b08-a5ba-4a35-8d47-00aa8e9b2616)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.1768] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.1829] manager[0x5622b241e000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 09:42:45 compute-2 systemd[1]: Starting Hostname Service...
Jan 23 09:42:45 compute-2 systemd[1]: Started Hostname Service.
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2602] hostname: hostname: using hostnamed
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2603] hostname: static hostname changed from (none) to "compute-2"
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2608] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2612] manager[0x5622b241e000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2613] manager[0x5622b241e000]: rfkill: WWAN hardware radio set enabled
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2638] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2647] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2648] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2649] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2650] manager: Networking is enabled by state file
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2652] settings: Loaded settings plugin: keyfile (internal)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2656] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2685] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2696] dhcp: init: Using DHCP client 'internal'
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2700] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2708] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2713] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2722] device (lo): Activation: starting connection 'lo' (a94dd518-f501-4cf9-bb13-731d2edd38ea)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2729] device (eth0): carrier: link connected
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2733] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2739] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2739] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2746] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2754] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2760] device (eth1): carrier: link connected
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2765] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2771] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918) (indicated)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2771] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2776] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2784] device (eth1): Activation: starting connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918)
Jan 23 09:42:45 compute-2 systemd[1]: Started Network Manager.
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2792] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2800] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2802] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2803] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2805] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2809] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2812] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2814] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2817] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2825] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2829] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2838] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2850] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2866] dhcp4 (eth0): state changed new lease, address=38.129.56.185
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2873] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2953] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2959] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2961] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2962] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2966] device (lo): Activation: successful, device activated.
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2971] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2975] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2978] device (eth1): Activation: successful, device activated.
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2986] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2989] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2991] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2993] device (eth0): Activation: successful, device activated.
Jan 23 09:42:45 compute-2 systemd[1]: Starting Network Manager Wait Online...
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2997] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 09:42:45 compute-2 NetworkManager[48964]: <info>  [1769161365.2999] manager: startup complete
Jan 23 09:42:45 compute-2 sudo[48949]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:45 compute-2 systemd[1]: Finished Network Manager Wait Online.
Jan 23 09:42:46 compute-2 sudo[49175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-douxvfnitpucbencvfvjgjdzresuzgoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161365.7103286-461-193575409214165/AnsiballZ_dnf.py'
Jan 23 09:42:46 compute-2 sudo[49175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:46 compute-2 python3.9[49177]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:42:55 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:42:58 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:42:58 compute-2 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:42:58 compute-2 systemd[1]: Reloading.
Jan 23 09:42:58 compute-2 systemd-rc-local-generator[49232]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:42:58 compute-2 systemd-sysv-generator[49236]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:42:58 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:42:59 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:42:59 compute-2 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:42:59 compute-2 systemd[1]: run-r938a9d7fd35b453092074315e15b1aeb.service: Deactivated successfully.
Jan 23 09:42:59 compute-2 sudo[49175]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:00 compute-2 sudo[49635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geachvjjhzwstppayclmqpjsgmobdhnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161379.8754096-498-233273033725735/AnsiballZ_stat.py'
Jan 23 09:43:00 compute-2 sudo[49635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:00 compute-2 python3.9[49637]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:43:00 compute-2 sudo[49635]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:01 compute-2 sudo[49787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhloelfjmhaorkfjryblenmafvpuhuir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161380.5934925-524-8114840170081/AnsiballZ_ini_file.py'
Jan 23 09:43:01 compute-2 sudo[49787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:01 compute-2 python3.9[49789]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:01 compute-2 sudo[49787]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:01 compute-2 sudo[49941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afhmseudmlhkjxwgmwxcqtmpslnrercr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161381.5810084-554-225366783723257/AnsiballZ_ini_file.py'
Jan 23 09:43:01 compute-2 sudo[49941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:02 compute-2 python3.9[49943]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:02 compute-2 sudo[49941]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:02 compute-2 sudo[50093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyeumawuvkpdqthzwngakyujymhvbqch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161382.2657685-554-178142873197502/AnsiballZ_ini_file.py'
Jan 23 09:43:02 compute-2 sudo[50093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:02 compute-2 python3.9[50095]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:02 compute-2 sudo[50093]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:03 compute-2 sudo[50245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uijbhpcowbvtxhfmjqaddooolxuzntmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161382.9702065-599-28273578273256/AnsiballZ_ini_file.py'
Jan 23 09:43:03 compute-2 sudo[50245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:03 compute-2 python3.9[50247]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:03 compute-2 sudo[50245]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:03 compute-2 sudo[50397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nczivgaucrljxxtwiagxmtqbjhhucxtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161383.5808864-599-144243938773258/AnsiballZ_ini_file.py'
Jan 23 09:43:03 compute-2 sudo[50397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:04 compute-2 python3.9[50399]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:04 compute-2 sudo[50397]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:04 compute-2 sudo[50549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwzdoojrbtgpqmemuzaoyrdccvswqxli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161384.2416582-644-72844948997566/AnsiballZ_stat.py'
Jan 23 09:43:04 compute-2 sudo[50549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:04 compute-2 python3.9[50551]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:04 compute-2 sudo[50549]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:05 compute-2 sudo[50672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liqymfprxwpmeudvcodiruclyeaqwsyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161384.2416582-644-72844948997566/AnsiballZ_copy.py'
Jan 23 09:43:05 compute-2 sudo[50672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:05 compute-2 python3.9[50674]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161384.2416582-644-72844948997566/.source _original_basename=.gxspqu7q follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:05 compute-2 sudo[50672]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:05 compute-2 sudo[50824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehasvrkhdnzaonoxivnuwmgkxqaquhqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161385.6921258-689-148961194118239/AnsiballZ_file.py'
Jan 23 09:43:05 compute-2 sudo[50824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:06 compute-2 python3.9[50826]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:06 compute-2 sudo[50824]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:06 compute-2 sudo[50976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzlzdcwcmtfamizysuiptfepsiwophec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161386.379956-713-127894932342372/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 23 09:43:06 compute-2 sudo[50976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:07 compute-2 python3.9[50978]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 23 09:43:07 compute-2 sudo[50976]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:07 compute-2 sudo[51128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwexlzpzihrywbxxdldzbqcthhedivae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161387.3181531-740-253318397637643/AnsiballZ_file.py'
Jan 23 09:43:07 compute-2 sudo[51128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:07 compute-2 python3.9[51130]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:07 compute-2 sudo[51128]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:08 compute-2 sudo[51280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trrmcnqmlwyxbbnkvsyyyjbyulahbcxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161388.246369-770-239534767465883/AnsiballZ_stat.py'
Jan 23 09:43:08 compute-2 sudo[51280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:08 compute-2 sudo[51280]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:09 compute-2 sudo[51403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahlulwgrxldarydovwqxgsadutiddwgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161388.246369-770-239534767465883/AnsiballZ_copy.py'
Jan 23 09:43:09 compute-2 sudo[51403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:09 compute-2 sudo[51403]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:10 compute-2 sudo[51555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbixzcqgfmlpjiddsinznyfwhwchdgdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161389.7531445-815-47682335986215/AnsiballZ_slurp.py'
Jan 23 09:43:10 compute-2 sudo[51555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:10 compute-2 python3.9[51557]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 23 09:43:10 compute-2 sudo[51555]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:11 compute-2 sudo[51730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfofkrspnzxvurdehhaqhnrnqbuhejjk ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161390.690089-842-208224040620380/async_wrapper.py j829007530849 300 /home/zuul/.ansible/tmp/ansible-tmp-1769161390.690089-842-208224040620380/AnsiballZ_edpm_os_net_config.py _'
Jan 23 09:43:11 compute-2 sudo[51730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:11 compute-2 ansible-async_wrapper.py[51732]: Invoked with j829007530849 300 /home/zuul/.ansible/tmp/ansible-tmp-1769161390.690089-842-208224040620380/AnsiballZ_edpm_os_net_config.py _
Jan 23 09:43:11 compute-2 ansible-async_wrapper.py[51735]: Starting module and watcher
Jan 23 09:43:11 compute-2 ansible-async_wrapper.py[51735]: Start watching 51736 (300)
Jan 23 09:43:11 compute-2 ansible-async_wrapper.py[51736]: Start module (51736)
Jan 23 09:43:11 compute-2 ansible-async_wrapper.py[51732]: Return async_wrapper task started.
Jan 23 09:43:11 compute-2 sudo[51730]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:11 compute-2 python3.9[51737]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 23 09:43:12 compute-2 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 23 09:43:12 compute-2 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 23 09:43:12 compute-2 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 23 09:43:12 compute-2 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 23 09:43:12 compute-2 kernel: cfg80211: failed to load regulatory.db
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6062] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6082] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6581] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6584] audit: op="connection-add" uuid="8afb3294-ce59-4632-a155-fd329a29f291" name="br-ex-br" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6599] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6601] audit: op="connection-add" uuid="41bf286c-1626-4e6a-a1a5-43899fda1607" name="br-ex-port" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6611] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6613] audit: op="connection-add" uuid="d5093589-ee87-4ea7-8b7c-afde35c70b3c" name="eth1-port" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6624] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6626] audit: op="connection-add" uuid="0ba7ccec-a493-41fb-8252-9c804fe27a7d" name="vlan20-port" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6636] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6638] audit: op="connection-add" uuid="c0864541-42de-4ccd-af05-4c7ce87a2504" name="vlan21-port" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6647] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6649] audit: op="connection-add" uuid="e9160306-09c2-4c99-9b9a-e0cda1df1e76" name="vlan22-port" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6658] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6659] audit: op="connection-add" uuid="1050024d-b7d9-4ad3-83a2-892c0a4b608a" name="vlan23-port" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6676] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6690] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6691] audit: op="connection-add" uuid="3a35641e-c187-4912-8bb0-af4e928650e9" name="br-ex-if" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6719] audit: op="connection-update" uuid="8b069e9e-bd63-5e9d-bdd1-b5c43b66b918" name="ci-private-network" args="ipv4.addresses,ipv4.dns,ipv4.method,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ipv6.routes,ipv6.addresses,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules,ipv6.method,ovs-external-ids.data,connection.port-type,connection.controller,connection.master,connection.timestamp,connection.slave-type,ovs-interface.type" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6732] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6734] audit: op="connection-add" uuid="f54a3e8b-d986-4d33-b8e7-13d4b8990616" name="vlan20-if" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6747] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6749] audit: op="connection-add" uuid="59fcfee2-78b9-4dfc-8119-3ad866d888c1" name="vlan21-if" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6762] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6764] audit: op="connection-add" uuid="76b43057-748b-4c3d-94fd-248bcced744b" name="vlan22-if" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6778] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6780] audit: op="connection-add" uuid="aedd85d4-378a-4eb8-b14f-77b720fda39f" name="vlan23-if" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6791] audit: op="connection-delete" uuid="52728e87-b91d-3812-9239-09489880e5d3" name="Wired connection 1" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6801] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6805] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6812] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6817] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (8afb3294-ce59-4632-a155-fd329a29f291)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6818] audit: op="connection-activate" uuid="8afb3294-ce59-4632-a155-fd329a29f291" name="br-ex-br" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6820] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6822] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6827] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6830] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (41bf286c-1626-4e6a-a1a5-43899fda1607)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6832] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6833] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6837] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6841] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (d5093589-ee87-4ea7-8b7c-afde35c70b3c)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6843] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6845] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6849] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6852] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (0ba7ccec-a493-41fb-8252-9c804fe27a7d)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6854] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6855] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6859] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6863] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (c0864541-42de-4ccd-af05-4c7ce87a2504)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6865] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6866] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6870] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6874] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (e9160306-09c2-4c99-9b9a-e0cda1df1e76)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6876] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6878] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6882] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6885] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (1050024d-b7d9-4ad3-83a2-892c0a4b608a)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6887] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6890] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6892] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6898] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6900] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6903] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6908] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (3a35641e-c187-4912-8bb0-af4e928650e9)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6909] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6912] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6914] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6915] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6917] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6925] device (eth1): disconnecting for new activation request.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6926] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6930] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6932] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6933] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6936] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6937] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6941] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6944] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (f54a3e8b-d986-4d33-b8e7-13d4b8990616)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6945] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6948] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6950] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6951] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6954] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6955] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6958] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6962] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (59fcfee2-78b9-4dfc-8119-3ad866d888c1)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6963] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6966] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6967] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6969] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6971] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6972] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6975] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6979] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (76b43057-748b-4c3d-94fd-248bcced744b)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6980] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6982] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6984] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6985] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6988] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <warn>  [1769161393.6990] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6992] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6997] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (aedd85d4-378a-4eb8-b14f-77b720fda39f)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.6998] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7001] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7003] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7004] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7006] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7017] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.autoconnect-priority" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7019] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7022] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7023] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7036] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7040] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7043] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7046] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7052] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 kernel: ovs-system: entered promiscuous mode
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7058] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7063] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 systemd-udevd[51743]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7067] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7069] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7073] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 kernel: Timeout policy base is empty
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7078] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7081] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7084] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7088] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7092] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7096] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7098] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7102] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7106] dhcp4 (eth0): canceled DHCP transaction
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7106] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7107] dhcp4 (eth0): state changed no lease
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7108] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7118] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7127] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51738 uid=0 result="fail" reason="Device is not activated"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7131] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 23 09:43:13 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7138] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7145] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7152] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7154] dhcp4 (eth0): state changed new lease, address=38.129.56.185
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7204] device (eth1): disconnecting for new activation request.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7205] audit: op="connection-activate" uuid="8b069e9e-bd63-5e9d-bdd1-b5c43b66b918" name="ci-private-network" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7251] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7253] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7374] device (eth1): Activation: starting connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918)
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7388] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7392] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7398] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7399] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7401] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7402] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7403] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7405] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7406] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7414] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 kernel: br-ex: entered promiscuous mode
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7432] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7438] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7442] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7445] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7448] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7451] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7454] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7458] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7461] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7464] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7468] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7471] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7474] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7478] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7484] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7490] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 kernel: vlan22: entered promiscuous mode
Jan 23 09:43:13 compute-2 systemd-udevd[51742]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7540] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7542] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7545] device (eth1): Activation: successful, device activated.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7553] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7562] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7580] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7581] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7584] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 kernel: vlan23: entered promiscuous mode
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7653] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7669] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 kernel: vlan21: entered promiscuous mode
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7695] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7698] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7703] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 kernel: vlan20: entered promiscuous mode
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7772] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7792] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7794] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7812] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7859] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7863] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7865] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7869] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7875] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7881] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7886] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7898] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7940] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7942] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-2 NetworkManager[48964]: <info>  [1769161393.7948] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:14 compute-2 NetworkManager[48964]: <info>  [1769161394.9216] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.0858] checkpoint[0x5622b23f4950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.0860] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 09:43:15 compute-2 sudo[52094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbcfwymkecuzhvltbwrzftpdumrpboeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161394.6956189-842-270961981824076/AnsiballZ_async_status.py'
Jan 23 09:43:15 compute-2 sudo[52094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:15 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 09:43:15 compute-2 python3.9[52096]: ansible-ansible.legacy.async_status Invoked with jid=j829007530849.51732 mode=status _async_dir=/root/.ansible_async
Jan 23 09:43:15 compute-2 sudo[52094]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.3730] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51738 uid=0 result="success"
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.3741] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51738 uid=0 result="success"
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.5919] audit: op="networking-control" arg="global-dns-configuration" pid=51738 uid=0 result="success"
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.5949] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.5975] audit: op="networking-control" arg="global-dns-configuration" pid=51738 uid=0 result="success"
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.6013] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51738 uid=0 result="success"
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.7465] checkpoint[0x5622b23f4a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 23 09:43:15 compute-2 NetworkManager[48964]: <info>  [1769161395.7471] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51738 uid=0 result="success"
Jan 23 09:43:15 compute-2 ansible-async_wrapper.py[51736]: Module complete (51736)
Jan 23 09:43:16 compute-2 ansible-async_wrapper.py[51735]: Done in kid B.
Jan 23 09:43:18 compute-2 sudo[52202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iikfyzrtejcmfibjggirqwzxpwuqiccr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161394.6956189-842-270961981824076/AnsiballZ_async_status.py'
Jan 23 09:43:18 compute-2 sudo[52202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:18 compute-2 python3.9[52204]: ansible-ansible.legacy.async_status Invoked with jid=j829007530849.51732 mode=status _async_dir=/root/.ansible_async
Jan 23 09:43:18 compute-2 sudo[52202]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:19 compute-2 sudo[52302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biludavlqqmkrkwelntfnoahwkepkhoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161394.6956189-842-270961981824076/AnsiballZ_async_status.py'
Jan 23 09:43:19 compute-2 sudo[52302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:19 compute-2 python3.9[52304]: ansible-ansible.legacy.async_status Invoked with jid=j829007530849.51732 mode=cleanup _async_dir=/root/.ansible_async
Jan 23 09:43:19 compute-2 sudo[52302]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:20 compute-2 sudo[52454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqztkpryfmpnkmvghkjoxkmghsbfxtpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161399.7871294-923-126053006994693/AnsiballZ_stat.py'
Jan 23 09:43:20 compute-2 sudo[52454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:20 compute-2 python3.9[52456]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:20 compute-2 sudo[52454]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:20 compute-2 sudo[52577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnchzsxokeyspvqiohjyiohnclpsigys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161399.7871294-923-126053006994693/AnsiballZ_copy.py'
Jan 23 09:43:20 compute-2 sudo[52577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:20 compute-2 python3.9[52579]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161399.7871294-923-126053006994693/.source.returncode _original_basename=.f4fcr_xd follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:20 compute-2 sudo[52577]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:21 compute-2 sudo[52729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eatvnoudbltphmzdfcslbvvjwpvsehxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161401.1310925-972-253011611847754/AnsiballZ_stat.py'
Jan 23 09:43:21 compute-2 sudo[52729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:21 compute-2 python3.9[52731]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:21 compute-2 sudo[52729]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:21 compute-2 sudo[52852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukbuaeamezwrmntiydmkkcqggwgockue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161401.1310925-972-253011611847754/AnsiballZ_copy.py'
Jan 23 09:43:21 compute-2 sudo[52852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:22 compute-2 python3.9[52854]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161401.1310925-972-253011611847754/.source.cfg _original_basename=.rej6hpns follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:22 compute-2 sudo[52852]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:22 compute-2 sudo[53005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-capqtltbgdiqmxyqdyptabjqnazdimri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161402.4529037-1017-92183357577611/AnsiballZ_systemd.py'
Jan 23 09:43:22 compute-2 sudo[53005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:23 compute-2 python3.9[53007]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:43:23 compute-2 systemd[1]: Reloading Network Manager...
Jan 23 09:43:23 compute-2 NetworkManager[48964]: <info>  [1769161403.0936] audit: op="reload" arg="0" pid=53011 uid=0 result="success"
Jan 23 09:43:23 compute-2 NetworkManager[48964]: <info>  [1769161403.0944] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 23 09:43:23 compute-2 systemd[1]: Reloaded Network Manager.
Jan 23 09:43:23 compute-2 sudo[53005]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:23 compute-2 sshd-session[44967]: Connection closed by 192.168.122.30 port 45902
Jan 23 09:43:23 compute-2 sshd-session[44964]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:43:23 compute-2 systemd-logind[786]: Session 11 logged out. Waiting for processes to exit.
Jan 23 09:43:23 compute-2 systemd[1]: session-11.scope: Deactivated successfully.
Jan 23 09:43:23 compute-2 systemd[1]: session-11.scope: Consumed 53.740s CPU time.
Jan 23 09:43:23 compute-2 systemd-logind[786]: Removed session 11.
Jan 23 09:43:29 compute-2 sshd-session[53042]: Accepted publickey for zuul from 192.168.122.30 port 52504 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:43:29 compute-2 systemd-logind[786]: New session 12 of user zuul.
Jan 23 09:43:29 compute-2 systemd[1]: Started Session 12 of User zuul.
Jan 23 09:43:29 compute-2 sshd-session[53042]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:43:31 compute-2 python3.9[53195]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:43:32 compute-2 python3.9[53350]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:43:33 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:43:33 compute-2 python3.9[53543]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:43:33 compute-2 sshd-session[53045]: Connection closed by 192.168.122.30 port 52504
Jan 23 09:43:33 compute-2 sshd-session[53042]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:43:33 compute-2 systemd[1]: session-12.scope: Deactivated successfully.
Jan 23 09:43:33 compute-2 systemd[1]: session-12.scope: Consumed 2.277s CPU time.
Jan 23 09:43:33 compute-2 systemd-logind[786]: Session 12 logged out. Waiting for processes to exit.
Jan 23 09:43:33 compute-2 systemd-logind[786]: Removed session 12.
Jan 23 09:43:39 compute-2 sshd-session[53572]: Accepted publickey for zuul from 192.168.122.30 port 39850 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:43:39 compute-2 systemd-logind[786]: New session 13 of user zuul.
Jan 23 09:43:39 compute-2 systemd[1]: Started Session 13 of User zuul.
Jan 23 09:43:39 compute-2 sshd-session[53572]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:43:40 compute-2 python3.9[53725]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:43:41 compute-2 python3.9[53879]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:43:42 compute-2 sudo[54034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jinlcbpyasfhqvinqpcsfrequefsxpfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161421.9610405-77-194507844608364/AnsiballZ_setup.py'
Jan 23 09:43:42 compute-2 sudo[54034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:42 compute-2 python3.9[54036]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:43:42 compute-2 sudo[54034]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:43 compute-2 sudo[54118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spudmdqkxvgwvrpjvilqtreznhclasyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161421.9610405-77-194507844608364/AnsiballZ_dnf.py'
Jan 23 09:43:43 compute-2 sudo[54118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:43 compute-2 python3.9[54120]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:43:45 compute-2 sudo[54118]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:45 compute-2 sudo[54272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysjvjsbulnrxuygknrjjypsspzfcyfji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161425.6769414-113-152825907008469/AnsiballZ_setup.py'
Jan 23 09:43:45 compute-2 sudo[54272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:46 compute-2 python3.9[54274]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:43:46 compute-2 sudo[54272]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:47 compute-2 sudo[54467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqdqvyvuzqsntkcuxmpvqniypnqlfkhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161426.9157832-146-102724023546529/AnsiballZ_file.py'
Jan 23 09:43:47 compute-2 sudo[54467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:47 compute-2 python3.9[54469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:47 compute-2 sudo[54467]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:48 compute-2 sudo[54619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjbblzizmnpdyfpiwjatcqhkuosbrkkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161427.700809-171-257181612390473/AnsiballZ_command.py'
Jan 23 09:43:48 compute-2 sudo[54619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:48 compute-2 python3.9[54621]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:43:48 compute-2 systemd[1]: var-lib-containers-storage-overlay-compat2262428241-merged.mount: Deactivated successfully.
Jan 23 09:43:48 compute-2 podman[54622]: 2026-01-23 09:43:48.425324946 +0000 UTC m=+0.053176737 system refresh
Jan 23 09:43:48 compute-2 sudo[54619]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:49 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:43:49 compute-2 sudo[54782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvcuvfisaqgkzuioisomywvrmnktpscf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161428.754381-194-95758373487693/AnsiballZ_stat.py'
Jan 23 09:43:49 compute-2 sudo[54782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:49 compute-2 python3.9[54784]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:49 compute-2 sudo[54782]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:50 compute-2 sudo[54905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imaiyupjxrwergnpohvqrbininbccxqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161428.754381-194-95758373487693/AnsiballZ_copy.py'
Jan 23 09:43:50 compute-2 sudo[54905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:50 compute-2 python3.9[54907]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161428.754381-194-95758373487693/.source.json follow=False _original_basename=podman_network_config.j2 checksum=1b0be18864a1e74e2095b155999887790d126e9d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:50 compute-2 sudo[54905]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:50 compute-2 sudo[55057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odffiwwiujwgdkdbagrescgbeecbknae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161430.5504313-240-19061006517870/AnsiballZ_stat.py'
Jan 23 09:43:50 compute-2 sudo[55057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:51 compute-2 python3.9[55059]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:51 compute-2 sudo[55057]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:51 compute-2 sudo[55180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyaqmguqylmaelsbfhofgenefhnzelga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161430.5504313-240-19061006517870/AnsiballZ_copy.py'
Jan 23 09:43:51 compute-2 sudo[55180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:51 compute-2 python3.9[55182]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161430.5504313-240-19061006517870/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:51 compute-2 sudo[55180]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:52 compute-2 sudo[55332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwxazvejnmkncacwxsyvqmizrbmlrpun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161431.976042-287-199744753458680/AnsiballZ_ini_file.py'
Jan 23 09:43:52 compute-2 sudo[55332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:52 compute-2 python3.9[55334]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:52 compute-2 sudo[55332]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:52 compute-2 sudo[55484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgmimkszckftcbljeehamqydpampgjbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161432.7204595-287-95571669206059/AnsiballZ_ini_file.py'
Jan 23 09:43:52 compute-2 sudo[55484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:53 compute-2 python3.9[55486]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:53 compute-2 sudo[55484]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:53 compute-2 sudo[55636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgewoxupehpilpirqthrkdpegclikrnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161433.312915-287-235747404939138/AnsiballZ_ini_file.py'
Jan 23 09:43:53 compute-2 sudo[55636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:53 compute-2 python3.9[55638]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:53 compute-2 sudo[55636]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:54 compute-2 sudo[55788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjjhzxadvwpxyrxemstkwoszcnmmnxzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161433.9196188-287-40682279585406/AnsiballZ_ini_file.py'
Jan 23 09:43:54 compute-2 sudo[55788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:54 compute-2 python3.9[55790]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:54 compute-2 sudo[55788]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:55 compute-2 sudo[55940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frhycbkokiqaaxulejpwtjwvoehqedgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161434.8801925-380-267246331587639/AnsiballZ_dnf.py'
Jan 23 09:43:55 compute-2 sudo[55940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:55 compute-2 python3.9[55942]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:43:56 compute-2 sudo[55940]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:57 compute-2 sudo[56093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zecqrfabigniwjrwfojvdzxyqfdsflpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161437.4032784-413-95286333034678/AnsiballZ_setup.py'
Jan 23 09:43:57 compute-2 sudo[56093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:57 compute-2 python3.9[56095]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:43:58 compute-2 sudo[56093]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:58 compute-2 sudo[56247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skqvmvgysrzbsmmzijlovcnmhdsgibrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161438.261506-437-176091212292885/AnsiballZ_stat.py'
Jan 23 09:43:58 compute-2 sudo[56247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:58 compute-2 python3.9[56249]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:43:58 compute-2 sudo[56247]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:59 compute-2 sudo[56399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmbdekwdiipkuxbqfnjncnehdxdsereg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161439.0134485-464-156201681568130/AnsiballZ_stat.py'
Jan 23 09:43:59 compute-2 sudo[56399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:59 compute-2 python3.9[56401]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:43:59 compute-2 sudo[56399]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:00 compute-2 sudo[56551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loxjhtfdilqionthignqnnrdkjeaigmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161439.8083725-494-160191980370084/AnsiballZ_command.py'
Jan 23 09:44:00 compute-2 sudo[56551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:00 compute-2 python3.9[56553]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:44:00 compute-2 sudo[56551]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:01 compute-2 sudo[56704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxitizqfevgtbpbtbhreoslfxnerufbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161440.5976539-525-136130998244228/AnsiballZ_service_facts.py'
Jan 23 09:44:01 compute-2 sudo[56704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:01 compute-2 python3.9[56706]: ansible-service_facts Invoked
Jan 23 09:44:01 compute-2 network[56723]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:44:01 compute-2 network[56724]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:44:01 compute-2 network[56725]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:44:04 compute-2 sudo[56704]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:05 compute-2 sudo[57008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgiifndfacnndlebwiacrslbmsuukdmz ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769161445.0369146-570-45093053283916/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769161445.0369146-570-45093053283916/args'
Jan 23 09:44:05 compute-2 sudo[57008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:05 compute-2 sudo[57008]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:05 compute-2 sudo[57175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljrxykmkumubqqpghidnnbtbdklvouwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161445.7225785-603-240586029247020/AnsiballZ_dnf.py'
Jan 23 09:44:05 compute-2 sudo[57175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:06 compute-2 python3.9[57177]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:44:08 compute-2 sudo[57175]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:10 compute-2 sudo[57328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwavhmhflfcfnnnlpitkiuohnalhzekp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161450.11237-642-187985959593872/AnsiballZ_package_facts.py'
Jan 23 09:44:10 compute-2 sudo[57328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:10 compute-2 python3.9[57330]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 09:44:11 compute-2 sudo[57328]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:12 compute-2 sudo[57480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsujlaszwgjqrpfpoybuupyqwburojzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161451.9298615-673-77630030882417/AnsiballZ_stat.py'
Jan 23 09:44:12 compute-2 sudo[57480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:12 compute-2 python3.9[57482]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:12 compute-2 sudo[57480]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:12 compute-2 sudo[57605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exoqzkqmrucfwqqxufmvwrcfpjhqqhvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161451.9298615-673-77630030882417/AnsiballZ_copy.py'
Jan 23 09:44:12 compute-2 sudo[57605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:12 compute-2 python3.9[57607]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161451.9298615-673-77630030882417/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:12 compute-2 sudo[57605]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:13 compute-2 sudo[57759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpijjuejktlaxntsbwlwdrlekwjffdji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161453.290572-717-179379001969501/AnsiballZ_stat.py'
Jan 23 09:44:13 compute-2 sudo[57759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:13 compute-2 python3.9[57761]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:13 compute-2 sudo[57759]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:14 compute-2 sudo[57884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zitdcpffgedcuxekzwbmmuwtrczssxfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161453.290572-717-179379001969501/AnsiballZ_copy.py'
Jan 23 09:44:14 compute-2 sudo[57884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:14 compute-2 python3.9[57886]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161453.290572-717-179379001969501/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:14 compute-2 sudo[57884]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:15 compute-2 sudo[58038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msfypjbrwqvjqdjqyowxiysomphxakqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161455.5880625-782-241624768512924/AnsiballZ_lineinfile.py'
Jan 23 09:44:15 compute-2 sudo[58038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:16 compute-2 python3.9[58040]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:16 compute-2 sudo[58038]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:17 compute-2 sudo[58192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnprsocqzqqmbyxjpvkximuetpojhvxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161457.4964283-826-236745841915648/AnsiballZ_setup.py'
Jan 23 09:44:17 compute-2 sudo[58192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:18 compute-2 python3.9[58194]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:44:18 compute-2 sudo[58192]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:18 compute-2 sudo[58276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiluythboevlclddfajlznrtkachhfuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161457.4964283-826-236745841915648/AnsiballZ_systemd.py'
Jan 23 09:44:18 compute-2 sudo[58276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:19 compute-2 python3.9[58278]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:44:19 compute-2 sudo[58276]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:20 compute-2 sudo[58430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eebguxpvjlbxahcvpzicpajprmeqjrge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161460.2666779-874-10733309184757/AnsiballZ_setup.py'
Jan 23 09:44:20 compute-2 sudo[58430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:20 compute-2 python3.9[58432]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:44:21 compute-2 sudo[58430]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:21 compute-2 sudo[58514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yclgfzzctjpvyjthhrqnexcckdehspzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161460.2666779-874-10733309184757/AnsiballZ_systemd.py'
Jan 23 09:44:21 compute-2 sudo[58514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:21 compute-2 python3.9[58516]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:44:21 compute-2 chronyd[794]: chronyd exiting
Jan 23 09:44:21 compute-2 systemd[1]: Stopping NTP client/server...
Jan 23 09:44:21 compute-2 systemd[1]: chronyd.service: Deactivated successfully.
Jan 23 09:44:21 compute-2 systemd[1]: Stopped NTP client/server.
Jan 23 09:44:21 compute-2 systemd[1]: Starting NTP client/server...
Jan 23 09:44:21 compute-2 chronyd[58525]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 09:44:21 compute-2 chronyd[58525]: Frequency -23.658 +/- 0.241 ppm read from /var/lib/chrony/drift
Jan 23 09:44:21 compute-2 chronyd[58525]: Loaded seccomp filter (level 2)
Jan 23 09:44:21 compute-2 systemd[1]: Started NTP client/server.
Jan 23 09:44:21 compute-2 sudo[58514]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:22 compute-2 sshd-session[53575]: Connection closed by 192.168.122.30 port 39850
Jan 23 09:44:22 compute-2 sshd-session[53572]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:44:22 compute-2 systemd[1]: session-13.scope: Deactivated successfully.
Jan 23 09:44:22 compute-2 systemd[1]: session-13.scope: Consumed 25.790s CPU time.
Jan 23 09:44:22 compute-2 systemd-logind[786]: Session 13 logged out. Waiting for processes to exit.
Jan 23 09:44:22 compute-2 systemd-logind[786]: Removed session 13.
Jan 23 09:44:29 compute-2 sshd-session[58551]: Accepted publickey for zuul from 192.168.122.30 port 37570 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:44:29 compute-2 systemd-logind[786]: New session 14 of user zuul.
Jan 23 09:44:29 compute-2 systemd[1]: Started Session 14 of User zuul.
Jan 23 09:44:29 compute-2 sshd-session[58551]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:44:29 compute-2 sudo[58704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgxelerlyylfhpmxlfgauzndgerdtfvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161469.2981293-23-275908610071516/AnsiballZ_file.py'
Jan 23 09:44:29 compute-2 sudo[58704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:29 compute-2 python3.9[58706]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:29 compute-2 sudo[58704]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:30 compute-2 sudo[58856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pucjgvfsaubfjntraunbpromyvpcxtko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161470.1658196-59-272084309555168/AnsiballZ_stat.py'
Jan 23 09:44:30 compute-2 sudo[58856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:30 compute-2 python3.9[58858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:30 compute-2 sudo[58856]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:31 compute-2 sudo[58979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dryzfuojydytmkrheuhltszhdlzfuneo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161470.1658196-59-272084309555168/AnsiballZ_copy.py'
Jan 23 09:44:31 compute-2 sudo[58979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:31 compute-2 python3.9[58981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161470.1658196-59-272084309555168/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:31 compute-2 sudo[58979]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:31 compute-2 sshd-session[58554]: Connection closed by 192.168.122.30 port 37570
Jan 23 09:44:31 compute-2 sshd-session[58551]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:44:31 compute-2 systemd[1]: session-14.scope: Deactivated successfully.
Jan 23 09:44:31 compute-2 systemd[1]: session-14.scope: Consumed 1.566s CPU time.
Jan 23 09:44:31 compute-2 systemd-logind[786]: Session 14 logged out. Waiting for processes to exit.
Jan 23 09:44:31 compute-2 systemd-logind[786]: Removed session 14.
Jan 23 09:44:37 compute-2 sshd-session[59006]: Accepted publickey for zuul from 192.168.122.30 port 52544 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:44:37 compute-2 systemd-logind[786]: New session 15 of user zuul.
Jan 23 09:44:37 compute-2 systemd[1]: Started Session 15 of User zuul.
Jan 23 09:44:37 compute-2 sshd-session[59006]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:44:38 compute-2 python3.9[59159]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:44:39 compute-2 sudo[59313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aagsjsryznioaawbblaxvvurghxaaykd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161479.321818-56-172971099610195/AnsiballZ_file.py'
Jan 23 09:44:39 compute-2 sudo[59313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:39 compute-2 python3.9[59315]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:39 compute-2 sudo[59313]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:40 compute-2 sudo[59488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgzejziwibghwxolbrnafrbflkksagkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161480.1717734-80-277237372158404/AnsiballZ_stat.py'
Jan 23 09:44:40 compute-2 sudo[59488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:40 compute-2 python3.9[59490]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:40 compute-2 sudo[59488]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:41 compute-2 sudo[59611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyexbieazdetxhkhgatnzkavqbzswtgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161480.1717734-80-277237372158404/AnsiballZ_copy.py'
Jan 23 09:44:41 compute-2 sudo[59611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:41 compute-2 python3.9[59613]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769161480.1717734-80-277237372158404/.source.json _original_basename=.maafkbe6 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:41 compute-2 sudo[59611]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:42 compute-2 sudo[59763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttxdzmwlnmjijqryojptqgekbaxpznhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161482.0350761-149-193192911953963/AnsiballZ_stat.py'
Jan 23 09:44:42 compute-2 sudo[59763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:42 compute-2 python3.9[59765]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:42 compute-2 sudo[59763]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:42 compute-2 sudo[59886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynbmtzrlblfcglrbjaqxgsfbmlhongez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161482.0350761-149-193192911953963/AnsiballZ_copy.py'
Jan 23 09:44:42 compute-2 sudo[59886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:42 compute-2 python3.9[59888]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161482.0350761-149-193192911953963/.source _original_basename=.t83b5tsx follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:43 compute-2 sudo[59886]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:43 compute-2 sudo[60038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajvqeetddsprgtyjfiaeqcehwccdsyvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161483.4156318-198-61068653966982/AnsiballZ_file.py'
Jan 23 09:44:43 compute-2 sudo[60038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:43 compute-2 python3.9[60040]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:44:43 compute-2 sudo[60038]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:44 compute-2 sudo[60190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edbswlgbbpgzcqwvduldudvmndwdichr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161484.1471531-222-172212484964514/AnsiballZ_stat.py'
Jan 23 09:44:44 compute-2 sudo[60190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:44 compute-2 python3.9[60192]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:44 compute-2 sudo[60190]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:44 compute-2 sudo[60313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhgwltqcbbmvujpkslpgqeoeadkjbtsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161484.1471531-222-172212484964514/AnsiballZ_copy.py'
Jan 23 09:44:44 compute-2 sudo[60313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:45 compute-2 python3.9[60315]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161484.1471531-222-172212484964514/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:44:45 compute-2 sudo[60313]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:45 compute-2 sudo[60465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xokwwpnbbjhjospyiszffzmbflfhgske ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161485.275144-222-125022384232886/AnsiballZ_stat.py'
Jan 23 09:44:45 compute-2 sudo[60465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:45 compute-2 python3.9[60467]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:45 compute-2 sudo[60465]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:45 compute-2 sudo[60588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czlrqxoembvmbqnzwpzgryugfwkvrdxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161485.275144-222-125022384232886/AnsiballZ_copy.py'
Jan 23 09:44:45 compute-2 sudo[60588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:46 compute-2 python3.9[60590]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161485.275144-222-125022384232886/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:44:46 compute-2 sudo[60588]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:47 compute-2 sudo[60740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgnwmqmflqgozoqftlvnsjicpuusppfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161487.0989525-308-241336651811026/AnsiballZ_file.py'
Jan 23 09:44:47 compute-2 sudo[60740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:47 compute-2 python3.9[60742]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:47 compute-2 sudo[60740]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:48 compute-2 sudo[60892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwsploatskavrlcqsreuvswvsdgvbomr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161487.7556248-332-21926245423317/AnsiballZ_stat.py'
Jan 23 09:44:48 compute-2 sudo[60892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:48 compute-2 python3.9[60894]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:48 compute-2 sudo[60892]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:48 compute-2 sudo[61015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkcdhlyysbqvyyfepyajcqgzbdknoxaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161487.7556248-332-21926245423317/AnsiballZ_copy.py'
Jan 23 09:44:48 compute-2 sudo[61015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:48 compute-2 python3.9[61017]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161487.7556248-332-21926245423317/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:48 compute-2 sudo[61015]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:49 compute-2 sudo[61167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzkbejhcmfuiyjjzlfckzsinaldklgfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161488.9803607-378-93908684126056/AnsiballZ_stat.py'
Jan 23 09:44:49 compute-2 sudo[61167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:49 compute-2 python3.9[61169]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:49 compute-2 sudo[61167]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:49 compute-2 sudo[61290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmwatxbulksaswwljrifjpiqpdpixgod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161488.9803607-378-93908684126056/AnsiballZ_copy.py'
Jan 23 09:44:49 compute-2 sudo[61290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:50 compute-2 python3.9[61292]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161488.9803607-378-93908684126056/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:50 compute-2 sudo[61290]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:50 compute-2 sudo[61442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfptueyyepczvedvozzxondbfuejjddx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161490.197795-423-175464494034644/AnsiballZ_systemd.py'
Jan 23 09:44:50 compute-2 sudo[61442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:51 compute-2 python3.9[61444]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:44:51 compute-2 systemd[1]: Reloading.
Jan 23 09:44:51 compute-2 systemd-rc-local-generator[61469]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:44:51 compute-2 systemd-sysv-generator[61474]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:44:51 compute-2 systemd[1]: Reloading.
Jan 23 09:44:51 compute-2 systemd-rc-local-generator[61509]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:44:51 compute-2 systemd-sysv-generator[61512]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:44:51 compute-2 systemd[1]: Starting EDPM Container Shutdown...
Jan 23 09:44:51 compute-2 systemd[1]: Finished EDPM Container Shutdown.
Jan 23 09:44:51 compute-2 sudo[61442]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:52 compute-2 sudo[61670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-memrwxdyeqvjnjkfzdmrkkvmrlefhpkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161491.9617503-447-196257357072030/AnsiballZ_stat.py'
Jan 23 09:44:52 compute-2 sudo[61670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:52 compute-2 python3.9[61672]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:52 compute-2 sudo[61670]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:52 compute-2 sudo[61793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogwjkizssawhavdtdcauorpdwatrgiex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161491.9617503-447-196257357072030/AnsiballZ_copy.py'
Jan 23 09:44:52 compute-2 sudo[61793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:52 compute-2 python3.9[61795]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161491.9617503-447-196257357072030/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:52 compute-2 sudo[61793]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:53 compute-2 sudo[61945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxsvbchkjjxtwjayckuurvqianxvznow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161493.2706807-492-1817716569185/AnsiballZ_stat.py'
Jan 23 09:44:53 compute-2 sudo[61945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:53 compute-2 python3.9[61947]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:53 compute-2 sudo[61945]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:54 compute-2 sudo[62068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpbngmnumabaynioelvtdkoekybehmqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161493.2706807-492-1817716569185/AnsiballZ_copy.py'
Jan 23 09:44:54 compute-2 sudo[62068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:54 compute-2 python3.9[62070]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161493.2706807-492-1817716569185/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:54 compute-2 sudo[62068]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:55 compute-2 sudo[62220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bipktqdnilarzbzwdehglyxjgvyaevyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161494.7709677-537-66206398487327/AnsiballZ_systemd.py'
Jan 23 09:44:55 compute-2 sudo[62220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:55 compute-2 python3.9[62222]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:44:55 compute-2 systemd[1]: Reloading.
Jan 23 09:44:55 compute-2 systemd-rc-local-generator[62246]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:44:55 compute-2 systemd-sysv-generator[62249]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:44:55 compute-2 systemd[1]: Reloading.
Jan 23 09:44:55 compute-2 systemd-sysv-generator[62290]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:44:55 compute-2 systemd-rc-local-generator[62286]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:44:55 compute-2 systemd[1]: Starting Create netns directory...
Jan 23 09:44:55 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 09:44:55 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 09:44:55 compute-2 systemd[1]: Finished Create netns directory.
Jan 23 09:44:55 compute-2 sudo[62220]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:56 compute-2 python3.9[62449]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:44:56 compute-2 network[62466]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:44:56 compute-2 network[62467]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:44:56 compute-2 network[62468]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:45:01 compute-2 sudo[62728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gajapnlrshqhnbocqysrjpumgejtjfwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161500.9012642-584-121551298467113/AnsiballZ_systemd.py'
Jan 23 09:45:01 compute-2 sudo[62728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:01 compute-2 python3.9[62730]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:45:01 compute-2 systemd[1]: Reloading.
Jan 23 09:45:01 compute-2 systemd-rc-local-generator[62758]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:45:01 compute-2 systemd-sysv-generator[62761]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:45:01 compute-2 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 23 09:45:02 compute-2 iptables.init[62770]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 23 09:45:02 compute-2 iptables.init[62770]: iptables: Flushing firewall rules: [  OK  ]
Jan 23 09:45:02 compute-2 systemd[1]: iptables.service: Deactivated successfully.
Jan 23 09:45:02 compute-2 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 23 09:45:02 compute-2 sudo[62728]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:02 compute-2 sudo[62964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlxaeigczrjqawetgomniskgvzopfgof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161502.3769562-584-214739810691394/AnsiballZ_systemd.py'
Jan 23 09:45:02 compute-2 sudo[62964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:02 compute-2 python3.9[62966]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:45:03 compute-2 sudo[62964]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:03 compute-2 sudo[63118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyrecthohfvsiibsroonydikimrdhbjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161503.3366866-632-113464239162438/AnsiballZ_systemd.py'
Jan 23 09:45:03 compute-2 sudo[63118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:03 compute-2 python3.9[63120]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:45:03 compute-2 systemd[1]: Reloading.
Jan 23 09:45:04 compute-2 systemd-rc-local-generator[63149]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:45:04 compute-2 systemd-sysv-generator[63153]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:45:04 compute-2 systemd[1]: Starting Netfilter Tables...
Jan 23 09:45:04 compute-2 systemd[1]: Finished Netfilter Tables.
Jan 23 09:45:04 compute-2 sudo[63118]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:05 compute-2 sudo[63309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djkaxyvjwiqdwggyxtifajcczkidjoba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161504.587962-656-92316396759960/AnsiballZ_command.py'
Jan 23 09:45:05 compute-2 sudo[63309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:05 compute-2 python3.9[63311]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:05 compute-2 sudo[63309]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:06 compute-2 sudo[63462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bidyxjvpkymbkayaqbisjfwldwpkotru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161506.1639-699-262750373175026/AnsiballZ_stat.py'
Jan 23 09:45:06 compute-2 sudo[63462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:06 compute-2 python3.9[63464]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:06 compute-2 sudo[63462]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:06 compute-2 sudo[63587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kopwiarykjqybdnttwckzbtiiqhbnliv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161506.1639-699-262750373175026/AnsiballZ_copy.py'
Jan 23 09:45:06 compute-2 sudo[63587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:07 compute-2 python3.9[63589]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161506.1639-699-262750373175026/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:07 compute-2 sudo[63587]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:07 compute-2 sudo[63740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnwoarsvvdvymldpmdblnosaupkmuvvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161507.4519374-744-96429226287463/AnsiballZ_systemd.py'
Jan 23 09:45:07 compute-2 sudo[63740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:08 compute-2 python3.9[63742]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:45:08 compute-2 systemd[1]: Reloading OpenSSH server daemon...
Jan 23 09:45:08 compute-2 sshd[1005]: Received SIGHUP; restarting.
Jan 23 09:45:08 compute-2 systemd[1]: Reloaded OpenSSH server daemon.
Jan 23 09:45:08 compute-2 sshd[1005]: Server listening on 0.0.0.0 port 22.
Jan 23 09:45:08 compute-2 sshd[1005]: Server listening on :: port 22.
Jan 23 09:45:08 compute-2 sudo[63740]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:08 compute-2 sudo[63896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybkazkuttjpkgjpsnpcprnqwxouxwuit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161508.317516-768-177646777579072/AnsiballZ_file.py'
Jan 23 09:45:08 compute-2 sudo[63896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:08 compute-2 python3.9[63898]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:08 compute-2 sudo[63896]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:09 compute-2 sudo[64048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lszkhdykptmgmlurhfeggiulocfdluxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161509.030427-792-221927825154635/AnsiballZ_stat.py'
Jan 23 09:45:09 compute-2 sudo[64048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:09 compute-2 python3.9[64050]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:09 compute-2 sudo[64048]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:09 compute-2 sudo[64171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvyezzdkdkdtpayptzgyjuocklrqczyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161509.030427-792-221927825154635/AnsiballZ_copy.py'
Jan 23 09:45:09 compute-2 sudo[64171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:10 compute-2 python3.9[64173]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161509.030427-792-221927825154635/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:10 compute-2 sudo[64171]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:10 compute-2 sudo[64323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnzszdqdmbeuzzbftzztmkqssjpulkbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161510.557117-846-52924809141563/AnsiballZ_timezone.py'
Jan 23 09:45:10 compute-2 sudo[64323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:11 compute-2 python3.9[64325]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 09:45:11 compute-2 systemd[1]: Starting Time & Date Service...
Jan 23 09:45:11 compute-2 systemd[1]: Started Time & Date Service.
Jan 23 09:45:11 compute-2 sudo[64323]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:11 compute-2 sudo[64479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enwkfuysnjcccvysrjupzncbxxfkgscw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161511.5611498-873-174178637274242/AnsiballZ_file.py'
Jan 23 09:45:11 compute-2 sudo[64479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:12 compute-2 python3.9[64481]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:12 compute-2 sudo[64479]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:13 compute-2 sudo[64631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqmyymfjbchsbrzmjkkcpnzbrudjpunv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161513.166487-897-241666304338393/AnsiballZ_stat.py'
Jan 23 09:45:13 compute-2 sudo[64631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:13 compute-2 python3.9[64633]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:13 compute-2 sudo[64631]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:13 compute-2 sudo[64754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jomdtopzzjgufiaworuaiqjzeoerpwmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161513.166487-897-241666304338393/AnsiballZ_copy.py'
Jan 23 09:45:13 compute-2 sudo[64754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:14 compute-2 python3.9[64756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161513.166487-897-241666304338393/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:14 compute-2 sudo[64754]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:14 compute-2 sudo[64906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwketytphutlaghawetjpmeajtoclrjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161514.383783-942-133691894704269/AnsiballZ_stat.py'
Jan 23 09:45:14 compute-2 sudo[64906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:14 compute-2 python3.9[64908]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:14 compute-2 sudo[64906]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:15 compute-2 sudo[65029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtzbkaiemxrjmpxryqcmjrmyoystazkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161514.383783-942-133691894704269/AnsiballZ_copy.py'
Jan 23 09:45:15 compute-2 sudo[65029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:15 compute-2 python3.9[65031]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161514.383783-942-133691894704269/.source.yaml _original_basename=.p08ujq7o follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:15 compute-2 sudo[65029]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:15 compute-2 sudo[65181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqwibaziielheoabgamgehmbiacsivdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161515.591409-987-162579262437420/AnsiballZ_stat.py'
Jan 23 09:45:15 compute-2 sudo[65181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:16 compute-2 python3.9[65183]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:16 compute-2 sudo[65181]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:16 compute-2 sudo[65304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nseybrgnixbahacqdbncfdxlvurtstyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161515.591409-987-162579262437420/AnsiballZ_copy.py'
Jan 23 09:45:16 compute-2 sudo[65304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:16 compute-2 python3.9[65306]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161515.591409-987-162579262437420/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:16 compute-2 sudo[65304]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:17 compute-2 sudo[65456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eosnjqsgnurbzajrzlhviczbrmktmquk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161517.136341-1032-87449273282257/AnsiballZ_command.py'
Jan 23 09:45:17 compute-2 sudo[65456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:17 compute-2 python3.9[65458]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:17 compute-2 sudo[65456]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:18 compute-2 sudo[65609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krlxujvmiussiyxpudlejhctsiodcjbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161517.9160752-1056-213677147477547/AnsiballZ_command.py'
Jan 23 09:45:18 compute-2 sudo[65609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:18 compute-2 python3.9[65611]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:18 compute-2 sudo[65609]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:19 compute-2 sudo[65762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nntcflqjjprgrtlhyijsvmsrhrbftdiy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769161518.6187377-1080-225854508165505/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 09:45:19 compute-2 sudo[65762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:19 compute-2 python3[65764]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 09:45:19 compute-2 sudo[65762]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:19 compute-2 sudo[65914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvfhjxwxkhharrplawadoqtqtuggnoqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161519.5056405-1104-257627253122521/AnsiballZ_stat.py'
Jan 23 09:45:19 compute-2 sudo[65914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:19 compute-2 python3.9[65916]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:20 compute-2 sudo[65914]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:20 compute-2 sudo[66037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvyiswnjujaiszkhwkwqftebqpavmytq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161519.5056405-1104-257627253122521/AnsiballZ_copy.py'
Jan 23 09:45:20 compute-2 sudo[66037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:20 compute-2 python3.9[66039]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161519.5056405-1104-257627253122521/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:20 compute-2 sudo[66037]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:21 compute-2 sudo[66189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcyozatisoxlbwyohckqzqviflpfdpst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161520.7699082-1149-45393942645795/AnsiballZ_stat.py'
Jan 23 09:45:21 compute-2 sudo[66189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:21 compute-2 python3.9[66191]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:21 compute-2 sudo[66189]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:21 compute-2 sudo[66312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzbvswxlhccqroolfpnuearkoaecexba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161520.7699082-1149-45393942645795/AnsiballZ_copy.py'
Jan 23 09:45:21 compute-2 sudo[66312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:21 compute-2 python3.9[66314]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161520.7699082-1149-45393942645795/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:21 compute-2 sudo[66312]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:22 compute-2 sudo[66464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isncdhuvlphpaiwmnzfxnblvjlgzgmkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161522.029869-1194-198351354581833/AnsiballZ_stat.py'
Jan 23 09:45:22 compute-2 sudo[66464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:22 compute-2 python3.9[66466]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:22 compute-2 sudo[66464]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:22 compute-2 sudo[66587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbtinhszbhpcoevykxhzgahilybhquwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161522.029869-1194-198351354581833/AnsiballZ_copy.py'
Jan 23 09:45:22 compute-2 sudo[66587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:23 compute-2 python3.9[66589]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161522.029869-1194-198351354581833/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:23 compute-2 sudo[66587]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:23 compute-2 sudo[66739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqqtgwkjfycxrmuxfllxfgfibiqewqub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161523.325557-1239-266791836193569/AnsiballZ_stat.py'
Jan 23 09:45:23 compute-2 sudo[66739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:23 compute-2 python3.9[66741]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:23 compute-2 sudo[66739]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:24 compute-2 sudo[66862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbahzdaaybxszawrdqrgqxjzgyjdwbhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161523.325557-1239-266791836193569/AnsiballZ_copy.py'
Jan 23 09:45:24 compute-2 sudo[66862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:24 compute-2 python3.9[66864]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161523.325557-1239-266791836193569/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:24 compute-2 sudo[66862]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:25 compute-2 sudo[67014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrakyuohrjjffgiywtqwhgbnynnzuugy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161524.6178775-1284-39467673817560/AnsiballZ_stat.py'
Jan 23 09:45:25 compute-2 sudo[67014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:25 compute-2 python3.9[67016]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:25 compute-2 sudo[67014]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:25 compute-2 sudo[67137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imserbptgkuoznprsrwvnqovxxunwyrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161524.6178775-1284-39467673817560/AnsiballZ_copy.py'
Jan 23 09:45:25 compute-2 sudo[67137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:25 compute-2 python3.9[67139]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161524.6178775-1284-39467673817560/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:25 compute-2 sudo[67137]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:26 compute-2 sudo[67289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihuruftqhdutyvxxcubesvcncjwstgmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161525.9735053-1329-244942817201011/AnsiballZ_file.py'
Jan 23 09:45:26 compute-2 sudo[67289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:26 compute-2 python3.9[67291]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:26 compute-2 sudo[67289]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:26 compute-2 sudo[67441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ookhkwgslbhuyqfyihmbxvznfvxekfvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161526.6732743-1353-275831608019670/AnsiballZ_command.py'
Jan 23 09:45:26 compute-2 sudo[67441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:27 compute-2 python3.9[67443]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:27 compute-2 sudo[67441]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:27 compute-2 sudo[67600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdjgyvupwglowsnaikbsumjdedlkhsvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161527.4765334-1377-218376752526175/AnsiballZ_blockinfile.py'
Jan 23 09:45:27 compute-2 sudo[67600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:28 compute-2 python3.9[67602]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:28 compute-2 sudo[67600]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:28 compute-2 sudo[67753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wesvxxrxkwwtgoyvpdufczxoguvvtpev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161528.4125347-1404-192604515540060/AnsiballZ_file.py'
Jan 23 09:45:28 compute-2 sudo[67753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:29 compute-2 python3.9[67755]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:29 compute-2 sudo[67753]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:29 compute-2 sudo[67905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akqsezptqpcmdknntvhcdunoiswmxxqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161529.2701724-1404-158895735165907/AnsiballZ_file.py'
Jan 23 09:45:29 compute-2 sudo[67905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:29 compute-2 python3.9[67907]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:29 compute-2 sudo[67905]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:30 compute-2 sudo[68057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssbshislfnhqkrzclkaihnzmamyvcitx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161529.9864218-1449-188402332441536/AnsiballZ_mount.py'
Jan 23 09:45:30 compute-2 sudo[68057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:30 compute-2 python3.9[68059]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 09:45:30 compute-2 sudo[68057]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:31 compute-2 sudo[68210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byyjcorifmlfsbixmxkjbxnlkqducpre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161530.890887-1449-78015753106599/AnsiballZ_mount.py'
Jan 23 09:45:31 compute-2 sudo[68210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:31 compute-2 python3.9[68212]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 09:45:31 compute-2 sudo[68210]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:31 compute-2 sshd-session[59009]: Connection closed by 192.168.122.30 port 52544
Jan 23 09:45:31 compute-2 sshd-session[59006]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:45:31 compute-2 systemd[1]: session-15.scope: Deactivated successfully.
Jan 23 09:45:31 compute-2 systemd[1]: session-15.scope: Consumed 34.191s CPU time.
Jan 23 09:45:31 compute-2 systemd-logind[786]: Session 15 logged out. Waiting for processes to exit.
Jan 23 09:45:31 compute-2 systemd-logind[786]: Removed session 15.
Jan 23 09:45:37 compute-2 sshd-session[68238]: Accepted publickey for zuul from 192.168.122.30 port 56128 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:45:37 compute-2 systemd-logind[786]: New session 16 of user zuul.
Jan 23 09:45:37 compute-2 systemd[1]: Started Session 16 of User zuul.
Jan 23 09:45:37 compute-2 sshd-session[68238]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:45:38 compute-2 sudo[68391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxwnakpeccfvsiamrmaedshksjdrbtmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161537.5814369-20-141306896820579/AnsiballZ_tempfile.py'
Jan 23 09:45:38 compute-2 sudo[68391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:38 compute-2 python3.9[68393]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 09:45:38 compute-2 sudo[68391]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:38 compute-2 sudo[68543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjyxewcdjywitceylfjsoldbruwecbho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161538.4041367-57-264489451124781/AnsiballZ_stat.py'
Jan 23 09:45:38 compute-2 sudo[68543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:38 compute-2 python3.9[68545]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:45:39 compute-2 sudo[68543]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:39 compute-2 sudo[68695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntyzlmvhlkoptkfhsgsqxzzonygoiupn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161539.3863478-86-151363150805600/AnsiballZ_setup.py'
Jan 23 09:45:39 compute-2 sudo[68695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:40 compute-2 python3.9[68697]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:45:40 compute-2 sudo[68695]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:41 compute-2 sudo[68847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlrfvdydokntjnyxvfejyxaopxlmhqhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161540.5853243-111-207465836274621/AnsiballZ_blockinfile.py'
Jan 23 09:45:41 compute-2 sudo[68847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:41 compute-2 python3.9[68849]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+cj2so8SS29oYZ1K+7e02qi6fVkGXJzGMkIN9mgJPLCBtQ6vpBYEObTZZXuMIHhdiMUAp6RDjs11OXDkAB9R7e2ncjMKn7J2EHbmceT7rNq9L0w+QaLKFxl+xdJQ9QtO9ioNgJFXXQZt/IOeE8S4I5yhEM5jn+YEW0LPbp99Wz1d1Ob4GI1t0hCEv/4ayC3nRIXkuIhl7mrV0s22F8NE8f0hZZKaw1u8xmmpbD8ZVBsC6cxWE3kIQBmHu8q9tylaZjLsjGxBDUF9ko3bxeppvLPDMem89VLQCWbgmOHl5ZIPsyNglusTIBUp8uA7g+Agz1uMojClMHnsZl68WjbCAVcRA9y/UgXphGyEYZCUJMv8CjYKzxriyHALZl6YFSyC5ELlEAxL8fyTwtXhQ1+e/lI9Ak3n4suC6JyH0NQ27MPIf7riyUFJLw9lZaDerZOkvI7/Y2PfRvdfyZ57g/xgGeLY0Ch30SFVC04lNXIpsOWbLBOg0BMP9ZiciAYAF9Yc=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIreWuVcekgp7kF5pU+4TIKLHZyhuqd4Ly312ExEA5EG
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJWfXOTsTXqDhdGhW7VcUXsYqCS7TzCPyaa9/dA9e0xKjnni1/GRM8FdYXWYbGsNnBQFWk3/pXD6sj3jKzK34AM=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWbrXZxuAw0n/xJmOvWW/Qbg53ya2CuJKzcHA+OvDpHLHGxkEuiUhwKvqUbfSTzn0o1M00OYITJIvZVINGRtQC7hGvBPWLVBON097mcmnju857I72U3dGdvGhnEUHyrglCV+xSkafQTTlnY9B59EKImUs/kiwRy3cYDWkCgthJgiPA4QSw6WrzaqpY2ET+7n+yY31EOagGA3ufW43qFbHX4diFuXpS1I1PLvvA4KINlMlsFcyR29j4nQk/vb5hMpLmBOlfVH16CXZC98a0ltp9ib7F3e1Wjdogj92kxwfQMYIeQEBp11Tc/PY5U90J51oyk8xYOKfsP3+r9yczmfRDjwR3+tzUMKyZYAsKQVcOGQC7x9sEXg3mBeXRVrlIVZFMuNVcYq4CY40fDIybcI25GxgRbQR7ZUWODG1SL7RF02Z+LQB6APXkzxdQUWLWPryj/EtOgnHQ1I0+BJTWrqGkKbSj41jhRTfS+MZvRXAJ+fNyZFhpkHo54DrCii4cbyM=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGRPkwTcFVg/dIKRq29iWBfkoVFqIQ1pXOCPxfcGWRFF
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGf/hJ2dg/PRwojw63FLyKqua+ChKP+2bc7Eb0p70H6ve1elFVeY8lVRXx33JWc2m/XfgSWPNcUs9zBG8QcFVak=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA/6JnQZ3CFC7xgv4DrvdZizVbVnsolKcWkvqzGu1hFHGmOEb7ehbxGPHBnp2N9iRf13H12EI0qNI6A2f44V0oXE3SP+fpJ6PVYQRQpKqTEiweqZaHEyYE2FnKy0HDQisg5hwr1egYLjGXChdkyqWSokL1LqaCyD2+EcOzUvC/GuVQ7eQnQBIGBpYAnNzS/64KKOZ0+0soOPJGxVCma6JN/2GcCunX6j3HmkOOQeuEFETXfUPHh1ylu2+3yINl34ERJN5YwgR/S+BKENOsJTu5XkYTCvc90CuvfkoF9K5Y2yE5nKwZaSf7n2SbUPil2Zph4l7opsd5IKxi6k2mVzw/CO2NHr136BZ06+sKXytDgorWqWzqnci8zfxeYF3D7q7AXD+IDVMP5T6op93oS2enAQFHG1vTLB0otQqnxUgNANbJkrKgXAS8G8I1m2sPz+qOFuuZa2/nqhzrd6/DEur5VoW6n9c/OcrbfapLEzD1jQDmsQI7oZkT++dt3Ogb3Vk=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIII1sLqY7Nqi1A3CKXLokfn1vrns/lK1gUkDNSlbek2o
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9QZXHUsthFMKA5Si4Htl7MIwK0G4VAltQgbo39JJHrgD7h27U1jbnuJQ1S2bBX8FMSkqf5TPmM7Gr9QOATO+4=
                                             create=True mode=0644 path=/tmp/ansible.tqrg3z1f state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:41 compute-2 sudo[68847]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:41 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 09:45:41 compute-2 sudo[69001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydyoxcfdrkhxbevpxnmrsawosnimwwxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161541.4311516-136-37518427146674/AnsiballZ_command.py'
Jan 23 09:45:41 compute-2 sudo[69001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:42 compute-2 python3.9[69003]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.tqrg3z1f' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:42 compute-2 sudo[69001]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:42 compute-2 sudo[69155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejvhxsjjqwmeyhtyqjlhplunqgsayocj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161542.2832441-159-105083682506342/AnsiballZ_file.py'
Jan 23 09:45:42 compute-2 sudo[69155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:42 compute-2 python3.9[69157]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.tqrg3z1f state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:42 compute-2 sudo[69155]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:43 compute-2 sshd-session[68241]: Connection closed by 192.168.122.30 port 56128
Jan 23 09:45:43 compute-2 sshd-session[68238]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:45:43 compute-2 systemd[1]: session-16.scope: Deactivated successfully.
Jan 23 09:45:43 compute-2 systemd[1]: session-16.scope: Consumed 3.243s CPU time.
Jan 23 09:45:43 compute-2 systemd-logind[786]: Session 16 logged out. Waiting for processes to exit.
Jan 23 09:45:43 compute-2 systemd-logind[786]: Removed session 16.
Jan 23 09:45:49 compute-2 sshd-session[69182]: Accepted publickey for zuul from 192.168.122.30 port 42804 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:45:49 compute-2 systemd-logind[786]: New session 17 of user zuul.
Jan 23 09:45:49 compute-2 systemd[1]: Started Session 17 of User zuul.
Jan 23 09:45:49 compute-2 sshd-session[69182]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:45:50 compute-2 python3.9[69335]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:45:51 compute-2 sudo[69489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxjuqgofzjcxvvbajlslnwxwitucxkez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161551.0636861-54-74889073561978/AnsiballZ_systemd.py'
Jan 23 09:45:51 compute-2 sudo[69489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:52 compute-2 python3.9[69491]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 09:45:52 compute-2 sudo[69489]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:52 compute-2 sudo[69643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhccxazzelkrfijlpxztmetqsyjbfxix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161552.3841157-77-256022550555532/AnsiballZ_systemd.py'
Jan 23 09:45:52 compute-2 sudo[69643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:52 compute-2 python3.9[69645]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:45:52 compute-2 sudo[69643]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:53 compute-2 sudo[69796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rocxskcdaaqayhddeaaayyjwlibrgtze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161553.2756212-105-280720415824998/AnsiballZ_command.py'
Jan 23 09:45:53 compute-2 sudo[69796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:54 compute-2 python3.9[69798]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:54 compute-2 sudo[69796]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:54 compute-2 sudo[69949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdortivzosclzlppfasxisyugdiwlvri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161554.2723217-128-74354229740166/AnsiballZ_stat.py'
Jan 23 09:45:54 compute-2 sudo[69949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:54 compute-2 python3.9[69951]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:45:54 compute-2 sudo[69949]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:55 compute-2 sudo[70103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liqbhndhovkwnjvgaurzobsbyggyjzem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161555.0771022-152-78397292310663/AnsiballZ_command.py'
Jan 23 09:45:55 compute-2 sudo[70103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:55 compute-2 python3.9[70105]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:55 compute-2 sudo[70103]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:56 compute-2 sudo[70258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vigjodfyozucrvjhrpunfvphlzjphizf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161555.8474362-176-93425839140530/AnsiballZ_file.py'
Jan 23 09:45:56 compute-2 sudo[70258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:56 compute-2 python3.9[70260]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:56 compute-2 sudo[70258]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:57 compute-2 sshd-session[69185]: Connection closed by 192.168.122.30 port 42804
Jan 23 09:45:57 compute-2 sshd-session[69182]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:45:57 compute-2 systemd[1]: session-17.scope: Deactivated successfully.
Jan 23 09:45:57 compute-2 systemd[1]: session-17.scope: Consumed 4.254s CPU time.
Jan 23 09:45:57 compute-2 systemd-logind[786]: Session 17 logged out. Waiting for processes to exit.
Jan 23 09:45:57 compute-2 systemd-logind[786]: Removed session 17.
Jan 23 09:46:02 compute-2 sshd-session[70285]: Accepted publickey for zuul from 192.168.122.30 port 54126 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:46:02 compute-2 systemd-logind[786]: New session 18 of user zuul.
Jan 23 09:46:02 compute-2 systemd[1]: Started Session 18 of User zuul.
Jan 23 09:46:02 compute-2 sshd-session[70285]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:46:03 compute-2 python3.9[70438]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:46:04 compute-2 sudo[70592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pblqudbwwszrzqfboutoytcoegxvxhtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161564.0859637-59-63567121170338/AnsiballZ_setup.py'
Jan 23 09:46:04 compute-2 sudo[70592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:04 compute-2 python3.9[70594]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:46:05 compute-2 sudo[70592]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:06 compute-2 sudo[70676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbpoykigrmjrlmaugzbbqljlvpiycteq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161564.0859637-59-63567121170338/AnsiballZ_dnf.py'
Jan 23 09:46:06 compute-2 sudo[70676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:06 compute-2 python3.9[70678]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 09:46:07 compute-2 sudo[70676]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:08 compute-2 python3.9[70829]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:46:10 compute-2 python3.9[70980]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:46:10 compute-2 python3.9[71130]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:46:10 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:46:11 compute-2 python3.9[71281]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:46:12 compute-2 sshd-session[70288]: Connection closed by 192.168.122.30 port 54126
Jan 23 09:46:12 compute-2 sshd-session[70285]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:46:12 compute-2 systemd-logind[786]: Session 18 logged out. Waiting for processes to exit.
Jan 23 09:46:12 compute-2 systemd[1]: session-18.scope: Deactivated successfully.
Jan 23 09:46:12 compute-2 systemd[1]: session-18.scope: Consumed 5.877s CPU time.
Jan 23 09:46:12 compute-2 systemd-logind[786]: Removed session 18.
Jan 23 09:46:20 compute-2 sshd-session[71306]: Accepted publickey for zuul from 38.129.56.17 port 45444 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:46:20 compute-2 systemd-logind[786]: New session 19 of user zuul.
Jan 23 09:46:20 compute-2 systemd[1]: Started Session 19 of User zuul.
Jan 23 09:46:20 compute-2 sshd-session[71306]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:46:21 compute-2 sudo[71382]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sszljzixlufcsgfjtjpwvrcatuvdzpxj ; /usr/bin/python3'
Jan 23 09:46:21 compute-2 sudo[71382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:21 compute-2 useradd[71386]: new group: name=ceph-admin, GID=42478
Jan 23 09:46:21 compute-2 useradd[71386]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Jan 23 09:46:21 compute-2 sudo[71382]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:21 compute-2 sudo[71468]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiybcfdizawokfabhpgfroggserfavpx ; /usr/bin/python3'
Jan 23 09:46:21 compute-2 sudo[71468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:21 compute-2 sudo[71468]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:22 compute-2 sudo[71541]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxinmywxtopulmflwgwikrklwuzxqhcg ; /usr/bin/python3'
Jan 23 09:46:22 compute-2 sudo[71541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:22 compute-2 sudo[71541]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:22 compute-2 sudo[71591]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdcmddtkehreshlbwpcivjkfxllcmrgq ; /usr/bin/python3'
Jan 23 09:46:22 compute-2 sudo[71591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:23 compute-2 sudo[71591]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:23 compute-2 sudo[71617]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpyhjdjzphqkmzytklonaltkwqoovcpx ; /usr/bin/python3'
Jan 23 09:46:23 compute-2 sudo[71617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:23 compute-2 sudo[71617]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:23 compute-2 sudo[71643]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyuxtvnvrzpeixefaylmyjexhhvutabw ; /usr/bin/python3'
Jan 23 09:46:23 compute-2 sudo[71643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:23 compute-2 sudo[71643]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:24 compute-2 sudo[71669]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zktasotylzyfjljzucvraybxdcgfxjwl ; /usr/bin/python3'
Jan 23 09:46:24 compute-2 sudo[71669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:24 compute-2 sudo[71669]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:24 compute-2 sudo[71747]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiqumsyshqcieaodamztczdetzrehhxv ; /usr/bin/python3'
Jan 23 09:46:24 compute-2 sudo[71747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:24 compute-2 sudo[71747]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:24 compute-2 sudo[71820]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgowomokwpwkzefnoymvioihvbwmpiuj ; /usr/bin/python3'
Jan 23 09:46:24 compute-2 sudo[71820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:25 compute-2 sudo[71820]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:25 compute-2 sudo[71922]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blhkwmpdgvxmlenfbunixewaaozozofz ; /usr/bin/python3'
Jan 23 09:46:25 compute-2 sudo[71922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:25 compute-2 sudo[71922]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:25 compute-2 sudo[71995]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oulwwerqyyusyvmhwkmselaruzaeowqc ; /usr/bin/python3'
Jan 23 09:46:25 compute-2 sudo[71995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:26 compute-2 sudo[71995]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:26 compute-2 sudo[72045]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gohqwwvomexafpdrmuupraknuxydtgsl ; /usr/bin/python3'
Jan 23 09:46:26 compute-2 sudo[72045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:26 compute-2 python3[72047]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:46:27 compute-2 sudo[72045]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:28 compute-2 sudo[72140]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eitudhyauxdubwkzcoryemluasfveksd ; /usr/bin/python3'
Jan 23 09:46:28 compute-2 sudo[72140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:28 compute-2 python3[72142]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 09:46:30 compute-2 sudo[72140]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:30 compute-2 sudo[72167]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swgsfojtaxkhxgdxtffxlrrnmrjfnmfx ; /usr/bin/python3'
Jan 23 09:46:30 compute-2 sudo[72167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:30 compute-2 python3[72169]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 09:46:30 compute-2 sudo[72167]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:30 compute-2 sudo[72193]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pseckdjrbqsirxgnxjanmwdtaxlvntbb ; /usr/bin/python3'
Jan 23 09:46:30 compute-2 sudo[72193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:30 compute-2 python3[72195]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:46:30 compute-2 kernel: loop: module loaded
Jan 23 09:46:30 compute-2 kernel: loop3: detected capacity change from 0 to 41943040
Jan 23 09:46:30 compute-2 sudo[72193]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:31 compute-2 sudo[72228]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiokpmnqnosybworydnvjdfmwsibzkxv ; /usr/bin/python3'
Jan 23 09:46:31 compute-2 sudo[72228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:31 compute-2 python3[72230]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:46:31 compute-2 lvm[72233]: PV /dev/loop3 not used.
Jan 23 09:46:31 compute-2 lvm[72242]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:46:31 compute-2 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 23 09:46:31 compute-2 lvm[72244]:   1 logical volume(s) in volume group "ceph_vg0" now active
Jan 23 09:46:31 compute-2 sudo[72228]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:31 compute-2 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 23 09:46:31 compute-2 chronyd[58525]: Selected source 167.160.187.179 (pool.ntp.org)
Jan 23 09:46:32 compute-2 sudo[72320]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxyfgcidzugzglrpgkceikysswffbelt ; /usr/bin/python3'
Jan 23 09:46:32 compute-2 sudo[72320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:32 compute-2 python3[72322]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:46:32 compute-2 sudo[72320]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:32 compute-2 sudo[72393]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkrufcunsfrbnjxookrpodohyfewntsx ; /usr/bin/python3'
Jan 23 09:46:32 compute-2 sudo[72393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:32 compute-2 python3[72395]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769161591.9139266-37005-66242161926865/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:46:32 compute-2 sudo[72393]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:33 compute-2 sudo[72443]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsstprbqegcvmfnkjrxvgzggvkzfasex ; /usr/bin/python3'
Jan 23 09:46:33 compute-2 sudo[72443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:33 compute-2 python3[72445]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:46:33 compute-2 systemd[1]: Reloading.
Jan 23 09:46:33 compute-2 systemd-rc-local-generator[72469]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:46:33 compute-2 systemd-sysv-generator[72475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:46:33 compute-2 systemd[1]: Starting Ceph OSD losetup...
Jan 23 09:46:33 compute-2 bash[72486]: /dev/loop3: [64513]:4328453 (/var/lib/ceph-osd-0.img)
Jan 23 09:46:33 compute-2 systemd[1]: Finished Ceph OSD losetup.
Jan 23 09:46:33 compute-2 lvm[72487]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:46:33 compute-2 lvm[72487]: VG ceph_vg0 finished
Jan 23 09:46:33 compute-2 sudo[72443]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:36 compute-2 python3[72511]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:48:49 compute-2 sshd-session[72555]: Accepted publickey for ceph-admin from 192.168.122.100 port 37016 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:49 compute-2 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 09:48:49 compute-2 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 09:48:49 compute-2 systemd-logind[786]: New session 20 of user ceph-admin.
Jan 23 09:48:49 compute-2 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 09:48:49 compute-2 systemd[1]: Starting User Manager for UID 42477...
Jan 23 09:48:49 compute-2 systemd[72559]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:49 compute-2 systemd[72559]: Queued start job for default target Main User Target.
Jan 23 09:48:49 compute-2 sshd-session[72573]: Accepted publickey for ceph-admin from 192.168.122.100 port 37018 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:49 compute-2 systemd-logind[786]: New session 22 of user ceph-admin.
Jan 23 09:48:49 compute-2 systemd[72559]: Created slice User Application Slice.
Jan 23 09:48:49 compute-2 systemd[72559]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:48:49 compute-2 systemd[72559]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:48:49 compute-2 systemd[72559]: Reached target Paths.
Jan 23 09:48:49 compute-2 systemd[72559]: Reached target Timers.
Jan 23 09:48:49 compute-2 systemd[72559]: Starting D-Bus User Message Bus Socket...
Jan 23 09:48:49 compute-2 systemd[72559]: Starting Create User's Volatile Files and Directories...
Jan 23 09:48:49 compute-2 systemd[72559]: Finished Create User's Volatile Files and Directories.
Jan 23 09:48:49 compute-2 systemd[72559]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:48:49 compute-2 systemd[72559]: Reached target Sockets.
Jan 23 09:48:49 compute-2 systemd[72559]: Reached target Basic System.
Jan 23 09:48:49 compute-2 systemd[72559]: Reached target Main User Target.
Jan 23 09:48:49 compute-2 systemd[72559]: Startup finished in 124ms.
Jan 23 09:48:49 compute-2 systemd[1]: Started User Manager for UID 42477.
Jan 23 09:48:49 compute-2 systemd[1]: Started Session 20 of User ceph-admin.
Jan 23 09:48:49 compute-2 systemd[1]: Started Session 22 of User ceph-admin.
Jan 23 09:48:49 compute-2 sshd-session[72555]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:49 compute-2 sshd-session[72573]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:49 compute-2 sudo[72580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:48:49 compute-2 sudo[72580]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:49 compute-2 sudo[72580]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:49 compute-2 sshd-session[72605]: Accepted publickey for ceph-admin from 192.168.122.100 port 37028 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:49 compute-2 systemd-logind[786]: New session 23 of user ceph-admin.
Jan 23 09:48:49 compute-2 systemd[1]: Started Session 23 of User ceph-admin.
Jan 23 09:48:49 compute-2 sshd-session[72605]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:49 compute-2 sudo[72609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-2
Jan 23 09:48:49 compute-2 sudo[72609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:49 compute-2 sudo[72609]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:50 compute-2 sshd-session[72634]: Accepted publickey for ceph-admin from 192.168.122.100 port 37040 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:50 compute-2 systemd-logind[786]: New session 24 of user ceph-admin.
Jan 23 09:48:50 compute-2 systemd[1]: Started Session 24 of User ceph-admin.
Jan 23 09:48:50 compute-2 sshd-session[72634]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:50 compute-2 sudo[72638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Jan 23 09:48:50 compute-2 sudo[72638]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:50 compute-2 sudo[72638]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:50 compute-2 sshd-session[72663]: Accepted publickey for ceph-admin from 192.168.122.100 port 37050 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:50 compute-2 systemd-logind[786]: New session 25 of user ceph-admin.
Jan 23 09:48:50 compute-2 systemd[1]: Started Session 25 of User ceph-admin.
Jan 23 09:48:50 compute-2 sshd-session[72663]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:50 compute-2 sudo[72667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:48:50 compute-2 sudo[72667]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:50 compute-2 sudo[72667]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:50 compute-2 sshd-session[72692]: Accepted publickey for ceph-admin from 192.168.122.100 port 37064 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:50 compute-2 systemd-logind[786]: New session 26 of user ceph-admin.
Jan 23 09:48:50 compute-2 systemd[1]: Started Session 26 of User ceph-admin.
Jan 23 09:48:50 compute-2 sshd-session[72692]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:50 compute-2 sudo[72696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:48:50 compute-2 sudo[72696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:50 compute-2 sudo[72696]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:51 compute-2 sshd-session[72721]: Accepted publickey for ceph-admin from 192.168.122.100 port 37078 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:51 compute-2 systemd-logind[786]: New session 27 of user ceph-admin.
Jan 23 09:48:51 compute-2 systemd[1]: Started Session 27 of User ceph-admin.
Jan 23 09:48:51 compute-2 sshd-session[72721]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:51 compute-2 sudo[72725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Jan 23 09:48:51 compute-2 sudo[72725]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:51 compute-2 sudo[72725]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:51 compute-2 sshd-session[72750]: Accepted publickey for ceph-admin from 192.168.122.100 port 52848 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:51 compute-2 systemd-logind[786]: New session 28 of user ceph-admin.
Jan 23 09:48:51 compute-2 systemd[1]: Started Session 28 of User ceph-admin.
Jan 23 09:48:51 compute-2 sshd-session[72750]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:51 compute-2 sudo[72754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:48:51 compute-2 sudo[72754]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:51 compute-2 sudo[72754]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:51 compute-2 sshd-session[72779]: Accepted publickey for ceph-admin from 192.168.122.100 port 52858 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:51 compute-2 systemd-logind[786]: New session 29 of user ceph-admin.
Jan 23 09:48:51 compute-2 systemd[1]: Started Session 29 of User ceph-admin.
Jan 23 09:48:51 compute-2 sshd-session[72779]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:51 compute-2 sudo[72783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Jan 23 09:48:51 compute-2 sudo[72783]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:51 compute-2 sudo[72783]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:51 compute-2 sshd-session[72808]: Accepted publickey for ceph-admin from 192.168.122.100 port 52868 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:51 compute-2 systemd-logind[786]: New session 30 of user ceph-admin.
Jan 23 09:48:52 compute-2 systemd[1]: Started Session 30 of User ceph-admin.
Jan 23 09:48:52 compute-2 sshd-session[72808]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:53 compute-2 sshd-session[72835]: Accepted publickey for ceph-admin from 192.168.122.100 port 52876 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:53 compute-2 systemd-logind[786]: New session 31 of user ceph-admin.
Jan 23 09:48:53 compute-2 systemd[1]: Started Session 31 of User ceph-admin.
Jan 23 09:48:53 compute-2 sshd-session[72835]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:53 compute-2 sudo[72839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Jan 23 09:48:53 compute-2 sudo[72839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:53 compute-2 sudo[72839]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:53 compute-2 sshd-session[72864]: Accepted publickey for ceph-admin from 192.168.122.100 port 52892 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:53 compute-2 systemd-logind[786]: New session 32 of user ceph-admin.
Jan 23 09:48:53 compute-2 systemd[1]: Started Session 32 of User ceph-admin.
Jan 23 09:48:53 compute-2 sshd-session[72864]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:53 compute-2 sudo[72868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-2
Jan 23 09:48:53 compute-2 sudo[72868]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:53 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:48:53 compute-2 sudo[72868]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:53 compute-2 sudo[72914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:49:53 compute-2 sudo[72914]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:53 compute-2 sudo[72914]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:53 compute-2 sudo[72939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:53 compute-2 sudo[72939]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:53 compute-2 sudo[72939]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:53 compute-2 sudo[72964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 09:49:53 compute-2 sudo[72964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:53 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:53 compute-2 sudo[72964]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:53 compute-2 sudo[73009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:53 compute-2 sudo[73009]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:53 compute-2 sudo[73009]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:54 compute-2 sudo[73034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:49:54 compute-2 sudo[73034]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:54 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:54 compute-2 sudo[73034]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:54 compute-2 sudo[73093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:54 compute-2 sudo[73093]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:54 compute-2 sudo[73093]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:54 compute-2 sudo[73118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:49:54 compute-2 sudo[73118]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:54 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:54 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:54 compute-2 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73155 (sysctl)
Jan 23 09:49:54 compute-2 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 23 09:49:54 compute-2 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 23 09:49:55 compute-2 sudo[73118]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:55 compute-2 sudo[73178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:55 compute-2 sudo[73178]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:55 compute-2 sudo[73178]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:55 compute-2 sudo[73203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 09:49:55 compute-2 sudo[73203]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:55 compute-2 sudo[73203]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:55 compute-2 sudo[73246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:55 compute-2 sudo[73246]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:55 compute-2 sudo[73246]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:55 compute-2 sudo[73271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- inventory --format=json-pretty --filter-for-batch
Jan 23 09:49:55 compute-2 sudo[73271]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:55 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:55 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:55 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:58 compute-2 systemd[1]: var-lib-containers-storage-overlay-compat2815177462-lower\x2dmapped.mount: Deactivated successfully.
Jan 23 09:50:33 compute-2 podman[73332]: 2026-01-23 09:50:33.868790058 +0000 UTC m=+38.084395360 container create e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 09:50:33 compute-2 podman[73332]: 2026-01-23 09:50:33.853240388 +0000 UTC m=+38.068845710 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:33 compute-2 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 23 09:50:33 compute-2 systemd[1]: Started libpod-conmon-e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de.scope.
Jan 23 09:50:33 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:50:33 compute-2 podman[73332]: 2026-01-23 09:50:33.97914756 +0000 UTC m=+38.194752882 container init e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 09:50:33 compute-2 podman[73332]: 2026-01-23 09:50:33.98780466 +0000 UTC m=+38.203409982 container start e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:50:33 compute-2 podman[73332]: 2026-01-23 09:50:33.99168024 +0000 UTC m=+38.207285592 container attach e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:50:33 compute-2 hungry_hermann[73400]: 167 167
Jan 23 09:50:33 compute-2 systemd[1]: libpod-e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de.scope: Deactivated successfully.
Jan 23 09:50:33 compute-2 podman[73332]: 2026-01-23 09:50:33.9951296 +0000 UTC m=+38.210734912 container died e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Jan 23 09:50:34 compute-2 systemd[1]: var-lib-containers-storage-overlay-30ee5341d832c697ce3d7f154ce66edfcc7e7c0b309659f122d027af60b99079-merged.mount: Deactivated successfully.
Jan 23 09:50:34 compute-2 podman[73332]: 2026-01-23 09:50:34.042870864 +0000 UTC m=+38.258476166 container remove e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:50:34 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:50:34 compute-2 systemd[1]: libpod-conmon-e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de.scope: Deactivated successfully.
Jan 23 09:50:34 compute-2 podman[73424]: 2026-01-23 09:50:34.205366881 +0000 UTC m=+0.048402000 container create d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 09:50:34 compute-2 systemd[1]: Started libpod-conmon-d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359.scope.
Jan 23 09:50:34 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:50:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4adaf7497dcf8b5afe505ff619eee9241ce1e42c7af5be057c7e964eef7d6d49/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4adaf7497dcf8b5afe505ff619eee9241ce1e42c7af5be057c7e964eef7d6d49/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:34 compute-2 podman[73424]: 2026-01-23 09:50:34.185239175 +0000 UTC m=+0.028274314 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:34 compute-2 podman[73424]: 2026-01-23 09:50:34.293197761 +0000 UTC m=+0.136232880 container init d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:50:34 compute-2 podman[73424]: 2026-01-23 09:50:34.303404158 +0000 UTC m=+0.146439267 container start d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 09:50:34 compute-2 podman[73424]: 2026-01-23 09:50:34.308019874 +0000 UTC m=+0.151054993 container attach d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]: [
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:     {
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         "available": false,
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         "being_replaced": false,
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         "ceph_device_lvm": false,
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         "lsm_data": {},
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         "lvs": [],
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         "path": "/dev/sr0",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         "rejected_reasons": [
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "Has a FileSystem",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "Insufficient space (<5GB)"
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         ],
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         "sys_api": {
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "actuators": null,
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "device_nodes": [
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:                 "sr0"
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             ],
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "devname": "sr0",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "human_readable_size": "482.00 KB",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "id_bus": "ata",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "model": "QEMU DVD-ROM",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "nr_requests": "2",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "parent": "/dev/sr0",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "partitions": {},
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "path": "/dev/sr0",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "removable": "1",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "rev": "2.5+",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "ro": "0",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "rotational": "1",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "sas_address": "",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "sas_device_handle": "",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "scheduler_mode": "mq-deadline",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "sectors": 0,
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "sectorsize": "2048",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "size": 493568.0,
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "support_discard": "2048",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "type": "disk",
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:             "vendor": "QEMU"
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:         }
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]:     }
Jan 23 09:50:35 compute-2 vigilant_solomon[73440]: ]
Jan 23 09:50:35 compute-2 systemd[1]: libpod-d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359.scope: Deactivated successfully.
Jan 23 09:50:35 compute-2 podman[73424]: 2026-01-23 09:50:35.079521892 +0000 UTC m=+0.922557041 container died d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Jan 23 09:50:35 compute-2 systemd[1]: var-lib-containers-storage-overlay-4adaf7497dcf8b5afe505ff619eee9241ce1e42c7af5be057c7e964eef7d6d49-merged.mount: Deactivated successfully.
Jan 23 09:50:35 compute-2 podman[73424]: 2026-01-23 09:50:35.120900119 +0000 UTC m=+0.963935238 container remove d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Jan 23 09:50:35 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:50:35 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:50:35 compute-2 systemd[1]: libpod-conmon-d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359.scope: Deactivated successfully.
Jan 23 09:50:35 compute-2 sudo[73271]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:50:35 compute-2 sudo[74444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74444]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:50:35 compute-2 sudo[74469]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74469]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:50:35 compute-2 sudo[74494]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74494]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:35 compute-2 sudo[74519]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74519]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:50:35 compute-2 sudo[74544]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74544]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:50:35 compute-2 sudo[74592]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74592]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:50:35 compute-2 sudo[74617]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74617]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 23 09:50:35 compute-2 sudo[74642]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74642]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:50:35 compute-2 sudo[74667]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74667]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:50:35 compute-2 sudo[74692]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74692]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:35 compute-2 sudo[74717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:50:35 compute-2 sudo[74717]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:35 compute-2 sudo[74717]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:36 compute-2 sudo[74742]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74742]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:50:36 compute-2 sudo[74767]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74767]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:50:36 compute-2 sudo[74815]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74815]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:50:36 compute-2 sudo[74840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74840]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:50:36 compute-2 sudo[74865]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74865]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:50:36 compute-2 sudo[74890]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74890]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:50:36 compute-2 sudo[74915]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74915]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:50:36 compute-2 sudo[74940]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74940]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:36 compute-2 sudo[74965]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74965]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[74990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:50:36 compute-2 sudo[74990]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[74990]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[75038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:50:36 compute-2 sudo[75038]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[75038]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[75063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:50:36 compute-2 sudo[75063]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[75063]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[75088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 23 09:50:36 compute-2 sudo[75088]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[75088]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[75113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:50:36 compute-2 sudo[75113]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[75113]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[75138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:50:36 compute-2 sudo[75138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[75138]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:36 compute-2 sudo[75163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:50:36 compute-2 sudo[75163]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:36 compute-2 sudo[75163]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:37 compute-2 sudo[75188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:37 compute-2 sudo[75188]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:37 compute-2 sudo[75188]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:37 compute-2 sudo[75213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:50:37 compute-2 sudo[75213]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:37 compute-2 sudo[75213]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:37 compute-2 sudo[75261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:50:37 compute-2 sudo[75261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:37 compute-2 sudo[75261]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:37 compute-2 sudo[75286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:50:37 compute-2 sudo[75286]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:37 compute-2 sudo[75286]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:37 compute-2 sudo[75311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:50:37 compute-2 sudo[75311]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:37 compute-2 sudo[75311]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:37 compute-2 sudo[75336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:50:37 compute-2 sudo[75336]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:37 compute-2 sudo[75336]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:37 compute-2 sudo[75361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:37 compute-2 sudo[75361]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:37 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:50:37 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:50:37 compute-2 podman[75427]: 2026-01-23 09:50:37.994391355 +0000 UTC m=+0.039229438 container create 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 23 09:50:38 compute-2 systemd[1]: Started libpod-conmon-8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c.scope.
Jan 23 09:50:38 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:50:38 compute-2 podman[75427]: 2026-01-23 09:50:38.055816766 +0000 UTC m=+0.100654859 container init 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 23 09:50:38 compute-2 podman[75427]: 2026-01-23 09:50:38.061128239 +0000 UTC m=+0.105966322 container start 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 23 09:50:38 compute-2 cranky_roentgen[75444]: 167 167
Jan 23 09:50:38 compute-2 systemd[1]: libpod-8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c.scope: Deactivated successfully.
Jan 23 09:50:38 compute-2 podman[75427]: 2026-01-23 09:50:38.070163457 +0000 UTC m=+0.115001540 container attach 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 09:50:38 compute-2 podman[75427]: 2026-01-23 09:50:38.070788451 +0000 UTC m=+0.115626574 container died 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:50:38 compute-2 podman[75427]: 2026-01-23 09:50:37.977363262 +0000 UTC m=+0.022201375 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:38 compute-2 podman[75427]: 2026-01-23 09:50:38.108852672 +0000 UTC m=+0.153690755 container remove 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:50:38 compute-2 systemd[1]: libpod-conmon-8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c.scope: Deactivated successfully.
Jan 23 09:50:38 compute-2 podman[75460]: 2026-01-23 09:50:38.187092381 +0000 UTC m=+0.048604335 container create 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:50:38 compute-2 systemd[1]: Started libpod-conmon-1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c.scope.
Jan 23 09:50:38 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:50:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e200e91b5eec93c149ba7f4c62968dbeb981a25b1b6485af560ee8141f136c90/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e200e91b5eec93c149ba7f4c62968dbeb981a25b1b6485af560ee8141f136c90/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e200e91b5eec93c149ba7f4c62968dbeb981a25b1b6485af560ee8141f136c90/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e200e91b5eec93c149ba7f4c62968dbeb981a25b1b6485af560ee8141f136c90/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:38 compute-2 podman[75460]: 2026-01-23 09:50:38.248025819 +0000 UTC m=+0.109537773 container init 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:50:38 compute-2 podman[75460]: 2026-01-23 09:50:38.255565283 +0000 UTC m=+0.117077237 container start 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Jan 23 09:50:38 compute-2 podman[75460]: 2026-01-23 09:50:38.259856253 +0000 UTC m=+0.121368407 container attach 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 23 09:50:38 compute-2 podman[75460]: 2026-01-23 09:50:38.167912677 +0000 UTC m=+0.029424661 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:38 compute-2 systemd[1]: libpod-1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c.scope: Deactivated successfully.
Jan 23 09:50:38 compute-2 podman[75460]: 2026-01-23 09:50:38.354075661 +0000 UTC m=+0.215587615 container died 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:50:38 compute-2 podman[75460]: 2026-01-23 09:50:38.402568302 +0000 UTC m=+0.264080256 container remove 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True)
Jan 23 09:50:38 compute-2 systemd[1]: libpod-conmon-1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c.scope: Deactivated successfully.
Jan 23 09:50:38 compute-2 systemd[1]: Reloading.
Jan 23 09:50:38 compute-2 systemd-rc-local-generator[75539]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:38 compute-2 systemd-sysv-generator[75542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:38 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:50:38 compute-2 systemd[1]: Reloading.
Jan 23 09:50:38 compute-2 systemd-rc-local-generator[75578]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:38 compute-2 systemd-sysv-generator[75582]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:38 compute-2 systemd[1]: Reached target All Ceph clusters and services.
Jan 23 09:50:38 compute-2 systemd[1]: Reloading.
Jan 23 09:50:39 compute-2 systemd-rc-local-generator[75617]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:39 compute-2 systemd-sysv-generator[75620]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:39 compute-2 systemd[1]: Reached target Ceph cluster f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:50:39 compute-2 systemd[1]: Reloading.
Jan 23 09:50:39 compute-2 systemd-rc-local-generator[75656]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:39 compute-2 systemd-sysv-generator[75662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:39 compute-2 systemd[1]: Reloading.
Jan 23 09:50:39 compute-2 systemd-rc-local-generator[75694]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:39 compute-2 systemd-sysv-generator[75699]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:39 compute-2 systemd[1]: Created slice Slice /system/ceph-f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:50:39 compute-2 systemd[1]: Reached target System Time Set.
Jan 23 09:50:39 compute-2 systemd[1]: Reached target System Time Synchronized.
Jan 23 09:50:39 compute-2 systemd[1]: Starting Ceph mon.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:50:39 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:50:39 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:50:39 compute-2 podman[75752]: 2026-01-23 09:50:39.976342718 +0000 UTC m=+0.047558140 container create 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 09:50:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af5d6b8d37d6524135efc3a43b1cfcd035993efbedfefadd35921da09b6ddc6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af5d6b8d37d6524135efc3a43b1cfcd035993efbedfefadd35921da09b6ddc6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af5d6b8d37d6524135efc3a43b1cfcd035993efbedfefadd35921da09b6ddc6/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:40 compute-2 podman[75752]: 2026-01-23 09:50:40.045522448 +0000 UTC m=+0.116737970 container init 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:50:40 compute-2 podman[75752]: 2026-01-23 09:50:39.956588191 +0000 UTC m=+0.027803643 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:40 compute-2 podman[75752]: 2026-01-23 09:50:40.052214463 +0000 UTC m=+0.123429935 container start 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 09:50:40 compute-2 bash[75752]: 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4
Jan 23 09:50:40 compute-2 systemd[1]: Started Ceph mon.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:50:40 compute-2 ceph-mon[75771]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:50:40 compute-2 ceph-mon[75771]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pidfile_write: ignore empty --pid-file
Jan 23 09:50:40 compute-2 ceph-mon[75771]: load: jerasure load: lrc 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: RocksDB version: 7.9.2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Git sha 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: DB SUMMARY
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: DB Session ID:  17IZ7DW7X4LNV3P33NJD
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: CURRENT file:  CURRENT
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                         Options.error_if_exists: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                       Options.create_if_missing: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                                     Options.env: 0x55c650375c20
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                                Options.info_log: 0x55c65134da20
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                              Options.statistics: (nil)
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                               Options.use_fsync: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                              Options.db_log_dir: 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                                 Options.wal_dir: 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                    Options.write_buffer_manager: 0x55c651351900
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.unordered_write: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                               Options.row_cache: None
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                              Options.wal_filter: None
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.two_write_queues: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.wal_compression: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.atomic_flush: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.max_background_jobs: 2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.max_background_compactions: -1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.max_subcompactions: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.max_total_wal_size: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                          Options.max_open_files: -1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:       Options.compaction_readahead_size: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Compression algorithms supported:
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         kZSTD supported: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         kXpressCompression supported: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         kBZip2Compression supported: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         kZSTDNotFinalCompression supported: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         kLZ4Compression supported: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         kZlibCompression supported: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         kLZ4HCCompression supported: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         kSnappyCompression supported: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:           Options.merge_operator: 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:        Options.compaction_filter: None
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c65134d6a0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55c6513709b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:        Options.write_buffer_size: 33554432
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:  Options.max_write_buffer_number: 2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:          Options.compression: NoCompression
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.num_levels: 7
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: dbf9ba81-81fe-4d1e-9307-233133587890
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161840104742, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161840106884, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161840107013, "job": 1, "event": "recovery_finished"}
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55c651372e00
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: DB pointer 0x55c651382000
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 09:50:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Jan 23 09:50:40 compute-2 ceph-mon[75771]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Jan 23 09:50:40 compute-2 ceph-mon[75771]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(???) e0 preinit fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:40 compute-2 sudo[75361]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).mds e1 new map
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).mds e1 print_map
                                           e1
                                           btime 2026-01-23T09:47:38:565964+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 1 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 1 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 1 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 1 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 1 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e30 e30: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 e31: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 3314933000852226048, adjusting msgr requires
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1144026165' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e21: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v74: 7 pgs: 4 active+clean, 2 creating+peering, 1 unknown; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e22: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1803776421' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1803776421' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e23: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mgrmap e9: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v77: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e24: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2193766018' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2193766018' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e25: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.1f scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.1f scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2528169956' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2528169956' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e26: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v80: 100 pgs: 2 peering, 93 unknown, 5 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1e scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1e scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.19 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.19 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1d scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1d scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.10 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.10 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v82: 162 pgs: 2 peering, 124 unknown, 36 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1f scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1f scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e27: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.1e deep-scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.1e deep-scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1c scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1c scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e28: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/29302298' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/29302298' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.18 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.18 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v85: 193 pgs: 93 unknown, 100 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.7 deep-scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.7 deep-scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.17 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.17 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.9 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.9 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e29: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.16 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v87: 193 pgs: 93 unknown, 100 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.8 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.8 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.6 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.6 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.16 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.11 deep-scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v88: 193 pgs: 62 unknown, 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.a scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.a scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.11 deep-scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.15 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2695482257' entity='client.admin' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.2 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.2 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.15 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.12 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v89: 193 pgs: 62 unknown, 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.12 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.0 deep-scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.0 deep-scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.14225 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Saving service ingress.rgw.default spec with placement count:2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.11 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.11 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.4 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.4 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.14 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.14 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v90: 193 pgs: 1 active, 1 active+clean+scrubbing, 191 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.1 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.12 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.12 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.3 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.3 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e30: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.1f scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.1f scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: osdmap e31: 2 total, 2 up, 2 in
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v92: 193 pgs: 1 active, 1 active+clean+scrubbing, 191 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.1f scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.1f scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.14227 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Saving service node-exporter spec with placement *
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.19 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.1c scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.1c scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.19 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Saving service grafana spec with placement compute-0;count:1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v94: 193 pgs: 1 active, 192 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.1e scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.1e scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Saving service prometheus spec with placement compute-0;count:1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Saving service alertmanager spec with placement compute-0;count:1
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.18 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.18 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 6.1b scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 6.1b scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.17 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.17 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 6.18 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 6.18 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1143624271' entity='client.admin' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v95: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.12 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.12 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 5.19 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 5.19 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.16 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.16 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v96: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.1b scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 3.1b scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.11 deep-scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.11 deep-scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3906855381' entity='client.admin' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.1c scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.1c scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.14 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.14 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 5.1d scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 5.1d scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v97: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.12 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.12 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2854364725' entity='client.admin' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 6.1f deep-scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 6.1f deep-scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.17 deep-scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 7.17 deep-scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v98: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Deploying daemon mon.compute-2 on compute-2
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.1d scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 4.1d scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.11 scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 2.11 scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 23 09:50:40 compute-2 ceph-mon[75771]: Cluster is now healthy
Jan 23 09:50:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2852887520' entity='client.admin' 
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 6.c scrub starts
Jan 23 09:50:40 compute-2 ceph-mon[75771]: 6.c scrub ok
Jan 23 09:50:40 compute-2 ceph-mon[75771]: pgmap v99: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:40 compute-2 ceph-mon[75771]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 23 09:50:42 compute-2 ceph-mon[75771]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Jan 23 09:50:42 compute-2 ceph-mon[75771]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 23 09:50:42 compute-2 ceph-mon[75771]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 23 09:50:42 compute-2 ceph-mon[75771]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:45 compute-2 ceph-mon[75771]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 23 09:50:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Jan 23 09:50:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:45 compute-2 ceph-mon[75771]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 2.f scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 2.f scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:50:45 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:45 compute-2 ceph-mon[75771]: mon.compute-0 calling monitor election
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 4.f scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 4.f scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 2.b scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 2.b scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: pgmap v100: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:45 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 4.3 deep-scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 4.3 deep-scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 7.16 scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 7.16 scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:45 compute-2 ceph-mon[75771]: mon.compute-2 calling monitor election
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 3.4 scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 3.4 scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 7.5 deep-scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 7.5 deep-scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: pgmap v101: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:45 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 6.1 scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 6.1 scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 7.0 scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 7.0 scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 6.6 scrub starts
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 6.6 scrub ok
Jan 23 09:50:45 compute-2 ceph-mon[75771]: pgmap v102: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:45 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:45 compute-2 ceph-mon[75771]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 23 09:50:45 compute-2 ceph-mon[75771]: monmap epoch 2
Jan 23 09:50:45 compute-2 ceph-mon[75771]: fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:45 compute-2 ceph-mon[75771]: last_changed 2026-01-23T09:50:40.551249+0000
Jan 23 09:50:45 compute-2 ceph-mon[75771]: created 2026-01-23T09:47:35.499222+0000
Jan 23 09:50:45 compute-2 ceph-mon[75771]: min_mon_release 19 (squid)
Jan 23 09:50:45 compute-2 ceph-mon[75771]: election_strategy: 1
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Jan 23 09:50:45 compute-2 ceph-mon[75771]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Jan 23 09:50:45 compute-2 ceph-mon[75771]: fsmap 
Jan 23 09:50:45 compute-2 ceph-mon[75771]: osdmap e31: 2 total, 2 up, 2 in
Jan 23 09:50:45 compute-2 ceph-mon[75771]: mgrmap e9: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:45 compute-2 ceph-mon[75771]: overall HEALTH_OK
Jan 23 09:50:45 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:45 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-2 ceph-mon[75771]: 2.5 scrub starts
Jan 23 09:50:47 compute-2 ceph-mon[75771]: 2.5 scrub ok
Jan 23 09:50:47 compute-2 ceph-mon[75771]: Deploying daemon mon.compute-1 on compute-1
Jan 23 09:50:47 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 09:50:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 09:50:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 09:50:47 compute-2 ceph-mon[75771]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 23 09:50:47 compute-2 ceph-mon[75771]: paxos.1).electionLogic(10) init, last seen epoch 10
Jan 23 09:50:47 compute-2 ceph-mon[75771]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:48 compute-2 sudo[75833]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayoodnfmysrbakhkicwmvsfnrbxvxaur ; /usr/bin/python3'
Jan 23 09:50:48 compute-2 sudo[75833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:50:48 compute-2 python3[75835]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:50:48 compute-2 sudo[75833]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:52 compute-2 ceph-mon[75771]: paxos.1).electionLogic(11) init, last seen epoch 11, mid-election, bumping
Jan 23 09:50:52 compute-2 ceph-mon[75771]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:52 compute-2 ceph-mon[75771]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 3.2 scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 3.2 scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:50:53 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:53 compute-2 ceph-mon[75771]: mon.compute-0 calling monitor election
Jan 23 09:50:53 compute-2 ceph-mon[75771]: mon.compute-2 calling monitor election
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 4.4 scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 4.4 scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 7.d scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 7.d scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 4.6 scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 4.6 scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: pgmap v104: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:53 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-2 ceph-mon[75771]: mon.compute-1 calling monitor election
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 7.c scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 7.c scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 3.1 scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 3.1 scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 2.1a deep-scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 2.1a deep-scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 6.4 deep-scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 6.4 deep-scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 7.19 scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 7.19 scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: pgmap v105: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:53 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 4.2 scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 4.2 scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 7.1a scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 7.1a scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 6.0 scrub starts
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 6.0 scrub ok
Jan 23 09:50:53 compute-2 ceph-mon[75771]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 09:50:53 compute-2 ceph-mon[75771]: monmap epoch 3
Jan 23 09:50:53 compute-2 ceph-mon[75771]: fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:53 compute-2 ceph-mon[75771]: last_changed 2026-01-23T09:50:47.540109+0000
Jan 23 09:50:53 compute-2 ceph-mon[75771]: created 2026-01-23T09:47:35.499222+0000
Jan 23 09:50:53 compute-2 ceph-mon[75771]: min_mon_release 19 (squid)
Jan 23 09:50:53 compute-2 ceph-mon[75771]: election_strategy: 1
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Jan 23 09:50:53 compute-2 ceph-mon[75771]: 2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Jan 23 09:50:53 compute-2 ceph-mon[75771]: fsmap 
Jan 23 09:50:53 compute-2 ceph-mon[75771]: osdmap e31: 2 total, 2 up, 2 in
Jan 23 09:50:53 compute-2 ceph-mon[75771]: mgrmap e9: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:53 compute-2 ceph-mon[75771]: overall HEALTH_OK
Jan 23 09:50:53 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:53 compute-2 sudo[75849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:50:53 compute-2 sudo[75849]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:53 compute-2 sudo[75849]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:53 compute-2 sudo[75874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:53 compute-2 sudo[75874]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:53 compute-2 podman[75941]: 2026-01-23 09:50:53.588742013 +0000 UTC m=+0.045999820 container create f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:50:53 compute-2 systemd[72559]: Starting Mark boot as successful...
Jan 23 09:50:53 compute-2 systemd[1]: Started libpod-conmon-f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d.scope.
Jan 23 09:50:53 compute-2 systemd[72559]: Finished Mark boot as successful.
Jan 23 09:50:53 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:50:53 compute-2 podman[75941]: 2026-01-23 09:50:53.644073039 +0000 UTC m=+0.101330876 container init f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:50:53 compute-2 podman[75941]: 2026-01-23 09:50:53.650817889 +0000 UTC m=+0.108075696 container start f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 23 09:50:53 compute-2 podman[75941]: 2026-01-23 09:50:53.654879699 +0000 UTC m=+0.112137596 container attach f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2)
Jan 23 09:50:53 compute-2 priceless_satoshi[75958]: 167 167
Jan 23 09:50:53 compute-2 systemd[1]: libpod-f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d.scope: Deactivated successfully.
Jan 23 09:50:53 compute-2 podman[75941]: 2026-01-23 09:50:53.656750716 +0000 UTC m=+0.114008523 container died f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 09:50:53 compute-2 podman[75941]: 2026-01-23 09:50:53.567126623 +0000 UTC m=+0.024384460 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:53 compute-2 systemd[1]: var-lib-containers-storage-overlay-25153313b40cf1585c20291fc995ff662d6c6077010694753f60df16ac3882eb-merged.mount: Deactivated successfully.
Jan 23 09:50:53 compute-2 podman[75941]: 2026-01-23 09:50:53.694186659 +0000 UTC m=+0.151444466 container remove f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 09:50:53 compute-2 systemd[1]: libpod-conmon-f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d.scope: Deactivated successfully.
Jan 23 09:50:53 compute-2 systemd[1]: Reloading.
Jan 23 09:50:53 compute-2 systemd-sysv-generator[76003]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:53 compute-2 systemd-rc-local-generator[76000]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:54 compute-2 systemd[1]: Reloading.
Jan 23 09:50:54 compute-2 systemd-sysv-generator[76044]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:54 compute-2 systemd-rc-local-generator[76038]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:54 compute-2 systemd[1]: Starting Ceph mgr.compute-2.uczrot for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:50:54 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:54 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:54 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.uczrot", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 09:50:54 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.uczrot", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 09:50:54 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 09:50:54 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:54 compute-2 ceph-mon[75771]: Deploying daemon mgr.compute-2.uczrot on compute-2
Jan 23 09:50:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4282911488' entity='client.admin' 
Jan 23 09:50:54 compute-2 ceph-mon[75771]: 5.18 scrub starts
Jan 23 09:50:54 compute-2 ceph-mon[75771]: 5.18 scrub ok
Jan 23 09:50:54 compute-2 ceph-mon[75771]: pgmap v106: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:54 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:54 compute-2 ceph-mon[75771]: 5.3 scrub starts
Jan 23 09:50:54 compute-2 ceph-mon[75771]: 5.3 scrub ok
Jan 23 09:50:54 compute-2 podman[76100]: 2026-01-23 09:50:54.472107157 +0000 UTC m=+0.044699030 container create 493e3a3dda7766566066c301f8593d7d1e6e8d9c2ba535766866cb6825a13835 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid)
Jan 23 09:50:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60da7b19c78c8fa12520947c627d4cababc90b9bbf66afd8090d0a7ec372a5ae/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60da7b19c78c8fa12520947c627d4cababc90b9bbf66afd8090d0a7ec372a5ae/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60da7b19c78c8fa12520947c627d4cababc90b9bbf66afd8090d0a7ec372a5ae/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60da7b19c78c8fa12520947c627d4cababc90b9bbf66afd8090d0a7ec372a5ae/merged/var/lib/ceph/mgr/ceph-compute-2.uczrot supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:54 compute-2 podman[76100]: 2026-01-23 09:50:54.52382901 +0000 UTC m=+0.096420893 container init 493e3a3dda7766566066c301f8593d7d1e6e8d9c2ba535766866cb6825a13835 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 09:50:54 compute-2 podman[76100]: 2026-01-23 09:50:54.531748857 +0000 UTC m=+0.104340730 container start 493e3a3dda7766566066c301f8593d7d1e6e8d9c2ba535766866cb6825a13835 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:50:54 compute-2 bash[76100]: 493e3a3dda7766566066c301f8593d7d1e6e8d9c2ba535766866cb6825a13835
Jan 23 09:50:54 compute-2 podman[76100]: 2026-01-23 09:50:54.453249318 +0000 UTC m=+0.025841211 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:54 compute-2 systemd[1]: Started Ceph mgr.compute-2.uczrot for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:50:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 23 09:50:54 compute-2 ceph-mgr[76120]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:50:54 compute-2 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:50:54 compute-2 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 09:50:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 23 09:50:54 compute-2 sudo[75874]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:54 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 09:50:54 compute-2 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:50:54 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 09:50:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:54.722+0000 7f857cdd3140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:50:54 compute-2 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:50:54 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 09:50:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:54.812+0000 7f857cdd3140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:50:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1019912092 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:50:55 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 09:50:55 compute-2 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:50:55 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 09:50:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:55.730+0000 7f857cdd3140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-2 ceph-mon[75771]: 4.18 deep-scrub starts
Jan 23 09:50:56 compute-2 ceph-mon[75771]: 4.18 deep-scrub ok
Jan 23 09:50:56 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:56 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:56 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3189222711' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 23 09:50:56 compute-2 ceph-mon[75771]: 3.6 scrub starts
Jan 23 09:50:56 compute-2 ceph-mon[75771]: 3.6 scrub ok
Jan 23 09:50:56 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:56 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.jmakme", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 09:50:56 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.jmakme", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 09:50:56 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 09:50:56 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:56 compute-2 ceph-mon[75771]: Deploying daemon mgr.compute-1.jmakme on compute-1
Jan 23 09:50:56 compute-2 ceph-mon[75771]: pgmap v107: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:50:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:56.424+0000 7f857cdd3140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:50:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:50:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:   from numpy import show_config as show_numpy_config
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 09:50:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:56.608+0000 7f857cdd3140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 09:50:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:56.680+0000 7f857cdd3140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:50:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:56.824+0000 7f857cdd3140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-2 sudo[76152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:50:56 compute-2 sudo[76152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:56 compute-2 sudo[76152]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:57 compute-2 sudo[76177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:57 compute-2 sudo[76177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:57 compute-2 ceph-mon[75771]: 6.19 scrub starts
Jan 23 09:50:57 compute-2 ceph-mon[75771]: 6.19 scrub ok
Jan 23 09:50:57 compute-2 ceph-mon[75771]: 5.0 scrub starts
Jan 23 09:50:57 compute-2 ceph-mon[75771]: 5.0 scrub ok
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3189222711' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-2 ceph-mon[75771]: mgrmap e10: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:57 compute-2 ceph-mon[75771]: 3.1c scrub starts
Jan 23 09:50:57 compute-2 ceph-mon[75771]: 3.1c scrub ok
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-2 ceph-mon[75771]: 3.7 deep-scrub starts
Jan 23 09:50:57 compute-2 ceph-mon[75771]: 3.7 deep-scrub ok
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1618362368' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 23 09:50:57 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 09:50:57 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:50:57 compute-2 podman[76242]: 2026-01-23 09:50:57.457640599 +0000 UTC m=+0.040714792 container create 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 23 09:50:57 compute-2 systemd[1]: Started libpod-conmon-71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28.scope.
Jan 23 09:50:57 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:50:57 compute-2 podman[76242]: 2026-01-23 09:50:57.438879342 +0000 UTC m=+0.021953555 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:57 compute-2 podman[76242]: 2026-01-23 09:50:57.537214383 +0000 UTC m=+0.120288606 container init 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True)
Jan 23 09:50:57 compute-2 podman[76242]: 2026-01-23 09:50:57.543872982 +0000 UTC m=+0.126947175 container start 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 23 09:50:57 compute-2 podman[76242]: 2026-01-23 09:50:57.548094144 +0000 UTC m=+0.131168437 container attach 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:50:57 compute-2 mystifying_dewdney[76259]: 167 167
Jan 23 09:50:57 compute-2 systemd[1]: libpod-71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28.scope: Deactivated successfully.
Jan 23 09:50:57 compute-2 podman[76242]: 2026-01-23 09:50:57.550740402 +0000 UTC m=+0.133814595 container died 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2)
Jan 23 09:50:57 compute-2 systemd[1]: var-lib-containers-storage-overlay-60b98240dd3187f16bba69540ab298071e9949c26218fba674f832b87f664bae-merged.mount: Deactivated successfully.
Jan 23 09:50:57 compute-2 podman[76242]: 2026-01-23 09:50:57.58379677 +0000 UTC m=+0.166870963 container remove 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 09:50:57 compute-2 systemd[1]: libpod-conmon-71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28.scope: Deactivated successfully.
Jan 23 09:50:57 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 09:50:57 compute-2 systemd[1]: Reloading.
Jan 23 09:50:57 compute-2 systemd-sysv-generator[76304]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:57 compute-2 systemd-rc-local-generator[76301]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:57 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 09:50:57 compute-2 systemd[1]: Reloading.
Jan 23 09:50:57 compute-2 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:50:57 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:50:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:57.986+0000 7f857cdd3140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:50:57 compute-2 systemd-rc-local-generator[76343]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:58 compute-2 systemd-sysv-generator[76349]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:58 compute-2 systemd[1]: Starting Ceph crash.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.225+0000 7f857cdd3140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 ceph-mon[75771]: Deploying daemon crash.compute-2 on compute-2
Jan 23 09:50:58 compute-2 ceph-mon[75771]: pgmap v108: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:58 compute-2 ceph-mon[75771]: 6.1a scrub starts
Jan 23 09:50:58 compute-2 ceph-mon[75771]: 6.1a scrub ok
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.309+0000 7f857cdd3140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 sshd-session[72724]: Connection closed by 192.168.122.100 port 37078
Jan 23 09:50:58 compute-2 sshd-session[72578]: Connection closed by 192.168.122.100 port 37016
Jan 23 09:50:58 compute-2 sshd-session[72579]: Connection closed by 192.168.122.100 port 37018
Jan 23 09:50:58 compute-2 sshd-session[72695]: Connection closed by 192.168.122.100 port 37064
Jan 23 09:50:58 compute-2 sshd-session[72811]: Connection closed by 192.168.122.100 port 52868
Jan 23 09:50:58 compute-2 sshd-session[72867]: Connection closed by 192.168.122.100 port 52892
Jan 23 09:50:58 compute-2 sshd-session[72666]: Connection closed by 192.168.122.100 port 37050
Jan 23 09:50:58 compute-2 sshd-session[72838]: Connection closed by 192.168.122.100 port 52876
Jan 23 09:50:58 compute-2 sshd-session[72782]: Connection closed by 192.168.122.100 port 52858
Jan 23 09:50:58 compute-2 sshd-session[72608]: Connection closed by 192.168.122.100 port 37028
Jan 23 09:50:58 compute-2 sshd-session[72753]: Connection closed by 192.168.122.100 port 52848
Jan 23 09:50:58 compute-2 sshd-session[72637]: Connection closed by 192.168.122.100 port 37040
Jan 23 09:50:58 compute-2 sshd-session[72692]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 sshd-session[72555]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 systemd[1]: session-26.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd[1]: session-20.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 sshd-session[72663]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 systemd[1]: session-25.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 26 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 20 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 25 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 sshd-session[72864]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 sshd-session[72721]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 sshd-session[72605]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 sshd-session[72835]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 sshd-session[72573]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 systemd[1]: session-27.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 sshd-session[72750]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 systemd[1]: session-31.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd[1]: session-23.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd[1]: session-22.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd[1]: session-28.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 26.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 27 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 32 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 22 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 31 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 23 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 28 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 20.
Jan 23 09:50:58 compute-2 sshd-session[72634]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 25.
Jan 23 09:50:58 compute-2 sshd-session[72779]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 sshd-session[72808]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 24 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd[1]: session-24.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd[1]: session-29.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd[1]: session-30.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 29 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Session 30 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 27.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 31.
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.389+0000 7f857cdd3140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 23.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 22.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 28.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 24.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 29.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 30.
Jan 23 09:50:58 compute-2 podman[76399]: 2026-01-23 09:50:58.398090596 +0000 UTC m=+0.042916945 container create 044486c85d2f7920782c0ee61f8358742d2c669d4f1247ecac174b4901e18cb4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:50:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8afc076c02f5da977f90fdad2cc82748c899bf1be65a6b94ef039d15298fdf4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8afc076c02f5da977f90fdad2cc82748c899bf1be65a6b94ef039d15298fdf4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8afc076c02f5da977f90fdad2cc82748c899bf1be65a6b94ef039d15298fdf4/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8afc076c02f5da977f90fdad2cc82748c899bf1be65a6b94ef039d15298fdf4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:58 compute-2 podman[76399]: 2026-01-23 09:50:58.461494901 +0000 UTC m=+0.106321270 container init 044486c85d2f7920782c0ee61f8358742d2c669d4f1247ecac174b4901e18cb4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 23 09:50:58 compute-2 podman[76399]: 2026-01-23 09:50:58.466672138 +0000 UTC m=+0.111498487 container start 044486c85d2f7920782c0ee61f8358742d2c669d4f1247ecac174b4901e18cb4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 23 09:50:58 compute-2 bash[76399]: 044486c85d2f7920782c0ee61f8358742d2c669d4f1247ecac174b4901e18cb4
Jan 23 09:50:58 compute-2 podman[76399]: 2026-01-23 09:50:58.37808025 +0000 UTC m=+0.022906619 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:58 compute-2 systemd[1]: Started Ceph crash.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.477+0000 7f857cdd3140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 23 09:50:58 compute-2 sudo[76177]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:58 compute-2 systemd[1]: session-32.scope: Deactivated successfully.
Jan 23 09:50:58 compute-2 systemd[1]: session-32.scope: Consumed 1min 22.769s CPU time.
Jan 23 09:50:58 compute-2 systemd-logind[786]: Removed session 32.
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.553+0000 7f857cdd3140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.614+0000 7f57c5bd8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.614+0000 7f57c5bd8640 -1 AuthRegistry(0x7f57c0069b10) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.616+0000 7f57c5bd8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.616+0000 7f57c5bd8640 -1 AuthRegistry(0x7f57c5bd6ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.617+0000 7f57bffff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.618+0000 7f57beffd640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.619+0000 7f57bf7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.619+0000 7f57c5bd8640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:50:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.935+0000 7f857cdd3140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:50:59 compute-2 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:50:59 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 09:50:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:59.044+0000 7f857cdd3140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:50:59 compute-2 ceph-mon[75771]: 4.0 scrub starts
Jan 23 09:50:59 compute-2 ceph-mon[75771]: 4.0 scrub ok
Jan 23 09:50:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1618362368' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 23 09:50:59 compute-2 ceph-mon[75771]: mgrmap e11: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:59 compute-2 ceph-mon[75771]: 5.1a deep-scrub starts
Jan 23 09:50:59 compute-2 ceph-mon[75771]: 5.1a deep-scrub ok
Jan 23 09:50:59 compute-2 ceph-mon[75771]: 4.7 deep-scrub starts
Jan 23 09:50:59 compute-2 ceph-mon[75771]: 4.7 deep-scrub ok
Jan 23 09:50:59 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 09:50:59 compute-2 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:50:59 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 09:50:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:59.582+0000 7f857cdd3140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020052805 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 09:51:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.218+0000 7f857cdd3140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.295+0000 7f857cdd3140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.378+0000 7f857cdd3140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.557+0000 7f857cdd3140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 09:51:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.650+0000 7f857cdd3140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:51:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.847+0000 7f857cdd3140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 09:51:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:01.114+0000 7f857cdd3140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 09:51:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:01.498+0000 7f857cdd3140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:01.585+0000 7f857cdd3140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55f098898d00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  1: '-n'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  2: 'mgr.compute-2.uczrot'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  3: '-f'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  4: '--setuser'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  5: 'ceph'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  6: '--setgroup'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  7: 'ceph'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 09:51:01 compute-2 ceph-mgr[76120]: mgr respawn  exe_path /proc/self/exe
Jan 23 09:51:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setuser ceph since I am not root
Jan 23 09:51:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setgroup ceph since I am not root
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 4.1b scrub starts
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 4.1b scrub ok
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 3.0 deep-scrub starts
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 3.0 deep-scrub ok
Jan 23 09:51:02 compute-2 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:51:02 compute-2 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 5.1b scrub starts
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 5.1b scrub ok
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 5.6 scrub starts
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 5.6 scrub ok
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 3.1d scrub starts
Jan 23 09:51:02 compute-2 ceph-mon[75771]: 3.1d scrub ok
Jan 23 09:51:02 compute-2 ceph-mon[75771]: Standby manager daemon compute-2.uczrot started
Jan 23 09:51:03 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 09:51:03 compute-2 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 09:51:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:03.513+0000 7f3a29999140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-2 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 09:51:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:03.601+0000 7f3a29999140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-2 ceph-mon[75771]: 5.c scrub starts
Jan 23 09:51:03 compute-2 ceph-mon[75771]: 5.c scrub ok
Jan 23 09:51:03 compute-2 ceph-mon[75771]: 4.1a scrub starts
Jan 23 09:51:03 compute-2 ceph-mon[75771]: 4.1a scrub ok
Jan 23 09:51:04 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 09:51:04 compute-2 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:04 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 09:51:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:04.592+0000 7f3a29999140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:04 compute-2 ceph-mon[75771]: 6.f scrub starts
Jan 23 09:51:04 compute-2 ceph-mon[75771]: 6.f scrub ok
Jan 23 09:51:04 compute-2 ceph-mon[75771]: 6.d scrub starts
Jan 23 09:51:04 compute-2 ceph-mon[75771]: 6.d scrub ok
Jan 23 09:51:04 compute-2 ceph-mon[75771]: mgrmap e12: compute-0.nbdygh(active, since 2m), standbys: compute-2.uczrot
Jan 23 09:51:04 compute-2 ceph-mon[75771]: 5.1c deep-scrub starts
Jan 23 09:51:04 compute-2 ceph-mon[75771]: 5.1c deep-scrub ok
Jan 23 09:51:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054705 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:05.432+0000 7f3a29999140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:51:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:51:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:51:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:   from numpy import show_config as show_numpy_config
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:05.641+0000 7f3a29999140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 09:51:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:05.726+0000 7f3a29999140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:51:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:05.891+0000 7f3a29999140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 e32: 2 total, 2 up, 2 in
Jan 23 09:51:06 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 09:51:06 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:51:06 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 09:51:06 compute-2 sshd-session[76470]: Accepted publickey for ceph-admin from 192.168.122.100 port 56054 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:51:06 compute-2 systemd-logind[786]: New session 33 of user ceph-admin.
Jan 23 09:51:06 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 09:51:06 compute-2 systemd[1]: Started Session 33 of User ceph-admin.
Jan 23 09:51:06 compute-2 sshd-session[76470]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:51:06 compute-2 sudo[76474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:06 compute-2 sudo[76474]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:06 compute-2 sudo[76474]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:06 compute-2 sudo[76499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:51:06 compute-2 sudo[76499]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:51:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.033+0000 7f3a29999140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:51:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.290+0000 7f3a29999140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 09:51:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.371+0000 7f3a29999140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.448+0000 7f3a29999140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.543+0000 7f3a29999140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.621+0000 7f3a29999140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 09:51:07 compute-2 ceph-mon[75771]: 3.b scrub starts
Jan 23 09:51:07 compute-2 ceph-mon[75771]: 3.b scrub ok
Jan 23 09:51:07 compute-2 ceph-mon[75771]: 5.d scrub starts
Jan 23 09:51:07 compute-2 ceph-mon[75771]: 5.d scrub ok
Jan 23 09:51:07 compute-2 ceph-mon[75771]: 6.e scrub starts
Jan 23 09:51:07 compute-2 ceph-mon[75771]: 6.e scrub ok
Jan 23 09:51:07 compute-2 ceph-mon[75771]: Active manager daemon compute-0.nbdygh restarted
Jan 23 09:51:07 compute-2 ceph-mon[75771]: Activating manager daemon compute-0.nbdygh
Jan 23 09:51:07 compute-2 ceph-mon[75771]: osdmap e32: 2 total, 2 up, 2 in
Jan 23 09:51:07 compute-2 ceph-mon[75771]: mgrmap e13: compute-0.nbdygh(active, starting, since 0.407511s), standbys: compute-2.uczrot
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-0.nbdygh", "id": "compute-0.nbdygh"}]: dispatch
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-2.uczrot", "id": "compute-2.uczrot"}]: dispatch
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 09:51:07 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 23 09:51:07 compute-2 podman[76594]: 2026-01-23 09:51:07.663429466 +0000 UTC m=+0.212839942 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True)
Jan 23 09:51:07 compute-2 podman[76594]: 2026-01-23 09:51:07.772199301 +0000 UTC m=+0.321609777 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:51:08 compute-2 sudo[76499]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:08 compute-2 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:51:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:08.038+0000 7f3a29999140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-2 sudo[76661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:08 compute-2 sudo[76661]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:08 compute-2 sudo[76661]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:08 compute-2 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:08.153+0000 7f3a29999140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 09:51:08 compute-2 sudo[76686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:51:08 compute-2 sudo[76686]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:08 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 09:51:08 compute-2 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:08.666+0000 7f3a29999140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 09:51:08 compute-2 ceph-mon[75771]: 5.a scrub starts
Jan 23 09:51:08 compute-2 ceph-mon[75771]: 5.a scrub ok
Jan 23 09:51:08 compute-2 ceph-mon[75771]: Manager daemon compute-0.nbdygh is now available
Jan 23 09:51:08 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 09:51:08 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 09:51:08 compute-2 ceph-mon[75771]: 4.c deep-scrub starts
Jan 23 09:51:08 compute-2 ceph-mon[75771]: 4.c deep-scrub ok
Jan 23 09:51:08 compute-2 ceph-mon[75771]: 6.9 deep-scrub starts
Jan 23 09:51:08 compute-2 ceph-mon[75771]: 6.9 deep-scrub ok
Jan 23 09:51:08 compute-2 ceph-mon[75771]: 4.e scrub starts
Jan 23 09:51:08 compute-2 ceph-mon[75771]: 4.e scrub ok
Jan 23 09:51:08 compute-2 ceph-mon[75771]: mgrmap e14: compute-0.nbdygh(active, since 2s), standbys: compute-2.uczrot
Jan 23 09:51:08 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-2 sudo[76686]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:08 compute-2 sudo[76744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:08 compute-2 sudo[76744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:08 compute-2 sudo[76744]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:08 compute-2 sudo[76769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 09:51:08 compute-2 sudo[76769]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:09 compute-2 sudo[76769]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.367+0000 7f3a29999140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.451+0000 7f3a29999140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.548+0000 7f3a29999140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.736+0000 7f3a29999140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 09:51:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.818+0000 7f3a29999140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Bus STARTING
Jan 23 09:51:09 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Serving on http://192.168.122.100:8765
Jan 23 09:51:09 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Serving on https://192.168.122.100:7150
Jan 23 09:51:09 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Bus STARTED
Jan 23 09:51:09 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Client ('192.168.122.100', 55612) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 09:51:09 compute-2 ceph-mon[75771]: 4.b deep-scrub starts
Jan 23 09:51:09 compute-2 ceph-mon[75771]: 4.b deep-scrub ok
Jan 23 09:51:09 compute-2 ceph-mon[75771]: pgmap v4: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:09 compute-2 ceph-mon[75771]: 5.f scrub starts
Jan 23 09:51:09 compute-2 ceph-mon[75771]: 5.f scrub ok
Jan 23 09:51:09 compute-2 ceph-mon[75771]: from='client.14304 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-password", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:09 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.986+0000 7f3a29999140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:51:09 compute-2 sudo[76812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:51:09 compute-2 sudo[76812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:09 compute-2 sudo[76812]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[76837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:51:10 compute-2 sudo[76837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[76837]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[76862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:10 compute-2 sudo[76862]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[76862]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:10 compute-2 sudo[76887]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:10 compute-2 sudo[76887]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[76887]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[76912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:10 compute-2 sudo[76912]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[76912]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 09:51:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:10.239+0000 7f3a29999140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:10 compute-2 sudo[76960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:10 compute-2 sudo[76960]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[76960]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[76985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:10 compute-2 sudo[76985]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[76985]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[77010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 23 09:51:10 compute-2 sudo[77010]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[77010]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[77035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:10 compute-2 sudo[77035]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[77035]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[77060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:10.560+0000 7f3a29999140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 09:51:10 compute-2 sudo[77060]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[77060]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[77085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:10 compute-2 sudo[77085]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[77085]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:10.646+0000 7f3a29999140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55e162952d00 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: mgr load Constructed class from module: dashboard
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: [dashboard INFO root] Starting engine...
Jan 23 09:51:10 compute-2 sudo[77110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:10 compute-2 sudo[77110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[77110]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[77147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:10 compute-2 sudo[77147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[77147]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 ceph-mgr[76120]: [dashboard INFO root] Engine started...
Jan 23 09:51:10 compute-2 sudo[77195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:10 compute-2 sudo[77195]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[77195]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[77220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:10 compute-2 sudo[77220]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[77220]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-2 sudo[77245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:10 compute-2 sudo[77245]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-2 sudo[77245]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:51:11 compute-2 sudo[77270]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77270]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:51:11 compute-2 sudo[77295]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77295]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-2 sudo[77320]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77320]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:11 compute-2 sudo[77345]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77345]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-2 sudo[77370]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77370]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-2 sudo[77418]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77418]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-2 sudo[77443]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77443]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:11 compute-2 sudo[77468]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77468]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:11 compute-2 sudo[77493]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77493]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:11 compute-2 sudo[77518]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77518]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-2 sudo[77543]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77543]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:11 compute-2 sudo[77568]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77568]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-2 sudo[77593]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77593]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-2 sudo[77641]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77641]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-2 sudo[77666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-2 sudo[77666]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-2 sudo[77666]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-2 sudo[77691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:12 compute-2 sudo[77691]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-2 sudo[77691]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:13 compute-2 ceph-mon[75771]: 5.b scrub starts
Jan 23 09:51:13 compute-2 ceph-mon[75771]: 5.b scrub ok
Jan 23 09:51:13 compute-2 ceph-mon[75771]: 5.e scrub starts
Jan 23 09:51:13 compute-2 ceph-mon[75771]: 5.e scrub ok
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-2 ceph-mon[75771]: mgrmap e15: compute-0.nbdygh(active, since 4s), standbys: compute-2.uczrot
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Standby manager daemon compute-1.jmakme started
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Adjusting osd_memory_target on compute-0 to 127.9M
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Unable to set osd_memory_target on compute-0 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Adjusting osd_memory_target on compute-1 to 127.9M
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Unable to set osd_memory_target on compute-1 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:13 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 09:51:13 compute-2 ceph-mon[75771]: pgmap v5: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:13 compute-2 ceph-mon[75771]: 3.1a scrub starts
Jan 23 09:51:13 compute-2 ceph-mon[75771]: 3.1a scrub ok
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Standby manager daemon compute-2.uczrot restarted
Jan 23 09:51:13 compute-2 ceph-mon[75771]: Standby manager daemon compute-2.uczrot started
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 5.8 deep-scrub starts
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 5.8 deep-scrub ok
Jan 23 09:51:14 compute-2 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:14 compute-2 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:14 compute-2 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='client.14310 -' entity='client.admin' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://192.168.122.100:9093", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:14 compute-2 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 6.b scrub starts
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 6.b scrub ok
Jan 23 09:51:14 compute-2 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:14 compute-2 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:14 compute-2 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 4.1 scrub starts
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 4.1 scrub ok
Jan 23 09:51:14 compute-2 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:14 compute-2 ceph-mon[75771]: pgmap v6: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 4.17 scrub starts
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 4.17 scrub ok
Jan 23 09:51:14 compute-2 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 6.3 scrub starts
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 6.3 scrub ok
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 4.16 scrub starts
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 4.16 scrub ok
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-2 ceph-mon[75771]: mgrmap e16: compute-0.nbdygh(active, since 7s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-1.jmakme", "id": "compute-1.jmakme"}]: dispatch
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 6.2 scrub starts
Jan 23 09:51:14 compute-2 ceph-mon[75771]: 6.2 scrub ok
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:15 compute-2 ceph-mon[75771]: Deploying daemon node-exporter.compute-0 on compute-0
Jan 23 09:51:15 compute-2 ceph-mon[75771]: 5.17 scrub starts
Jan 23 09:51:15 compute-2 ceph-mon[75771]: 5.17 scrub ok
Jan 23 09:51:15 compute-2 ceph-mon[75771]: pgmap v7: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail; 28 KiB/s rd, 0 B/s wr, 11 op/s
Jan 23 09:51:15 compute-2 ceph-mon[75771]: from='client.14316 -' entity='client.admin' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://192.168.122.100:9092", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:15 compute-2 ceph-mon[75771]: 6.5 deep-scrub starts
Jan 23 09:51:15 compute-2 ceph-mon[75771]: 6.5 deep-scrub ok
Jan 23 09:51:16 compute-2 ceph-mon[75771]: 6.14 scrub starts
Jan 23 09:51:16 compute-2 ceph-mon[75771]: 6.14 scrub ok
Jan 23 09:51:16 compute-2 ceph-mon[75771]: from='client.14322 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "http://192.168.122.100:3100", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:16 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:16 compute-2 ceph-mon[75771]: 5.1 scrub starts
Jan 23 09:51:16 compute-2 ceph-mon[75771]: 5.1 scrub ok
Jan 23 09:51:16 compute-2 ceph-mon[75771]: pgmap v8: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 0 B/s wr, 8 op/s
Jan 23 09:51:17 compute-2 ceph-mon[75771]: 5.14 scrub starts
Jan 23 09:51:17 compute-2 ceph-mon[75771]: 5.14 scrub ok
Jan 23 09:51:17 compute-2 ceph-mon[75771]: 3.5 scrub starts
Jan 23 09:51:17 compute-2 ceph-mon[75771]: 3.5 scrub ok
Jan 23 09:51:17 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:17 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:17 compute-2 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:17 compute-2 ceph-mon[75771]: Deploying daemon node-exporter.compute-1 on compute-1
Jan 23 09:51:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1110789864' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  1: '-n'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  2: 'mgr.compute-2.uczrot'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  3: '-f'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  4: '--setuser'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  5: 'ceph'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  6: '--setgroup'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  7: 'ceph'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr respawn  exe_path /proc/self/exe
Jan 23 09:51:17 compute-2 sshd-session[76473]: Connection closed by 192.168.122.100 port 56054
Jan 23 09:51:17 compute-2 sshd-session[76470]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:51:17 compute-2 systemd[1]: session-33.scope: Deactivated successfully.
Jan 23 09:51:17 compute-2 systemd[1]: session-33.scope: Consumed 4.234s CPU time.
Jan 23 09:51:17 compute-2 systemd-logind[786]: Session 33 logged out. Waiting for processes to exit.
Jan 23 09:51:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setuser ceph since I am not root
Jan 23 09:51:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setgroup ceph since I am not root
Jan 23 09:51:17 compute-2 systemd-logind[786]: Removed session 33.
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 09:51:17 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 09:51:18 compute-2 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:18 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 09:51:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:18.006+0000 7fc4a9399140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:18.098+0000 7fc4a9399140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:18 compute-2 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:18 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 09:51:18 compute-2 ceph-mon[75771]: 6.16 deep-scrub starts
Jan 23 09:51:18 compute-2 ceph-mon[75771]: 6.16 deep-scrub ok
Jan 23 09:51:18 compute-2 ceph-mon[75771]: 5.2 scrub starts
Jan 23 09:51:18 compute-2 ceph-mon[75771]: 5.2 scrub ok
Jan 23 09:51:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1110789864' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 23 09:51:18 compute-2 ceph-mon[75771]: mgrmap e17: compute-0.nbdygh(active, since 12s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/435334493' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 23 09:51:18 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 09:51:19 compute-2 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:19.037+0000 7fc4a9399140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 09:51:19 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:51:19 compute-2 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:19.734+0000 7fc4a9399140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:51:19 compute-2 ceph-mon[75771]: 5.12 scrub starts
Jan 23 09:51:19 compute-2 ceph-mon[75771]: 5.12 scrub ok
Jan 23 09:51:19 compute-2 ceph-mon[75771]: 3.3 scrub starts
Jan 23 09:51:19 compute-2 ceph-mon[75771]: 3.3 scrub ok
Jan 23 09:51:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/435334493' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 23 09:51:19 compute-2 ceph-mon[75771]: mgrmap e18: compute-0.nbdygh(active, since 13s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:51:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:51:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:   from numpy import show_config as show_numpy_config
Jan 23 09:51:19 compute-2 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:19.932+0000 7fc4a9399140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 09:51:20 compute-2 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:20.019+0000 7fc4a9399140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:20 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 09:51:20 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 09:51:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:20 compute-2 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:20.170+0000 7fc4a9399140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:20 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:51:20 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 09:51:20 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:51:20 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 09:51:21 compute-2 ceph-mon[75771]: 6.11 deep-scrub starts
Jan 23 09:51:21 compute-2 ceph-mon[75771]: 6.11 deep-scrub ok
Jan 23 09:51:21 compute-2 ceph-mon[75771]: 4.5 scrub starts
Jan 23 09:51:21 compute-2 ceph-mon[75771]: 4.5 scrub ok
Jan 23 09:51:21 compute-2 ceph-mon[75771]: 5.4 deep-scrub starts
Jan 23 09:51:21 compute-2 ceph-mon[75771]: 5.4 deep-scrub ok
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.331+0000 7fc4a9399140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.579+0000 7fc4a9399140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.658+0000 7fc4a9399140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 09:51:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.730+0000 7fc4a9399140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:51:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.822+0000 7fc4a9399140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 09:51:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.900+0000 7fc4a9399140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 09:51:22 compute-2 ceph-mon[75771]: 5.13 scrub starts
Jan 23 09:51:22 compute-2 ceph-mon[75771]: 5.13 scrub ok
Jan 23 09:51:22 compute-2 ceph-mon[75771]: 6.10 scrub starts
Jan 23 09:51:22 compute-2 ceph-mon[75771]: 6.10 scrub ok
Jan 23 09:51:22 compute-2 ceph-mon[75771]: 3.9 scrub starts
Jan 23 09:51:22 compute-2 ceph-mon[75771]: 3.9 scrub ok
Jan 23 09:51:22 compute-2 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:22.289+0000 7fc4a9399140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:51:22 compute-2 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:22.416+0000 7fc4a9399140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 09:51:22 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 09:51:22 compute-2 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 09:51:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:22.933+0000 7fc4a9399140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-mon[75771]: 6.13 scrub starts
Jan 23 09:51:23 compute-2 ceph-mon[75771]: 6.13 scrub ok
Jan 23 09:51:23 compute-2 ceph-mon[75771]: 5.7 scrub starts
Jan 23 09:51:23 compute-2 ceph-mon[75771]: 5.7 scrub ok
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.562+0000 7fc4a9399140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.637+0000 7fc4a9399140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 09:51:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.725+0000 7fc4a9399140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.888+0000 7fc4a9399140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 09:51:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.974+0000 7fc4a9399140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:24.153+0000 7fc4a9399140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:51:24 compute-2 ceph-mon[75771]: 5.1e scrub starts
Jan 23 09:51:24 compute-2 ceph-mon[75771]: 5.1e scrub ok
Jan 23 09:51:24 compute-2 ceph-mon[75771]: 6.8 scrub starts
Jan 23 09:51:24 compute-2 ceph-mon[75771]: 6.8 scrub ok
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 09:51:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:24.421+0000 7fc4a9399140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:24.725+0000 7fc4a9399140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 09:51:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:24.804+0000 7fc4a9399140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55caeacef860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  1: '-n'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  2: 'mgr.compute-2.uczrot'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  3: '-f'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  4: '--setuser'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  5: 'ceph'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  6: '--setgroup'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  7: 'ceph'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr respawn  exe_path /proc/self/exe
Jan 23 09:51:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setuser ceph since I am not root
Jan 23 09:51:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setgroup ceph since I am not root
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 09:51:24 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 09:51:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:25.081+0000 7f2757a28140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:25 compute-2 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:25 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 09:51:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:25.189+0000 7f2757a28140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:25 compute-2 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:25 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 09:51:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e33 e33: 2 total, 2 up, 2 in
Jan 23 09:51:25 compute-2 ceph-mon[75771]: 6.1d scrub starts
Jan 23 09:51:25 compute-2 ceph-mon[75771]: 6.1d scrub ok
Jan 23 09:51:25 compute-2 ceph-mon[75771]: Standby manager daemon compute-1.jmakme restarted
Jan 23 09:51:25 compute-2 ceph-mon[75771]: Standby manager daemon compute-1.jmakme started
Jan 23 09:51:25 compute-2 ceph-mon[75771]: 4.a scrub starts
Jan 23 09:51:25 compute-2 ceph-mon[75771]: 4.a scrub ok
Jan 23 09:51:25 compute-2 ceph-mon[75771]: Standby manager daemon compute-2.uczrot restarted
Jan 23 09:51:25 compute-2 ceph-mon[75771]: Standby manager daemon compute-2.uczrot started
Jan 23 09:51:25 compute-2 ceph-mon[75771]: Active manager daemon compute-0.nbdygh restarted
Jan 23 09:51:25 compute-2 ceph-mon[75771]: Activating manager daemon compute-0.nbdygh
Jan 23 09:51:25 compute-2 ceph-mon[75771]: osdmap e33: 2 total, 2 up, 2 in
Jan 23 09:51:25 compute-2 ceph-mon[75771]: mgrmap e19: compute-0.nbdygh(active, starting, since 0.060897s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:26 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 09:51:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:26.189+0000 7f2757a28140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-2 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 09:51:26 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:51:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:26.932+0000 7f2757a28140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-2 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:51:26 compute-2 ceph-mon[75771]: 7.1b scrub starts
Jan 23 09:51:26 compute-2 ceph-mon[75771]: 7.1b scrub ok
Jan 23 09:51:26 compute-2 ceph-mon[75771]: 6.7 scrub starts
Jan 23 09:51:26 compute-2 ceph-mon[75771]: 6.7 scrub ok
Jan 23 09:51:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:51:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:51:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:   from numpy import show_config as show_numpy_config
Jan 23 09:51:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:27.158+0000 7f2757a28140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-2 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 09:51:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:27.251+0000 7f2757a28140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-2 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 09:51:27 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 09:51:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:27.414+0000 7f2757a28140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-2 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:51:27 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 09:51:27 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:51:27 compute-2 ceph-mon[75771]: 7.18 scrub starts
Jan 23 09:51:27 compute-2 ceph-mon[75771]: 7.18 scrub ok
Jan 23 09:51:27 compute-2 ceph-mon[75771]: 4.d scrub starts
Jan 23 09:51:27 compute-2 ceph-mon[75771]: 4.d scrub ok
Jan 23 09:51:27 compute-2 ceph-mon[75771]: 2.1b scrub starts
Jan 23 09:51:27 compute-2 ceph-mon[75771]: 2.1b scrub ok
Jan 23 09:51:27 compute-2 ceph-mon[75771]: 3.c scrub starts
Jan 23 09:51:27 compute-2 ceph-mon[75771]: 3.c scrub ok
Jan 23 09:51:28 compute-2 systemd[1]: Stopping User Manager for UID 42477...
Jan 23 09:51:28 compute-2 systemd[72559]: Activating special unit Exit the Session...
Jan 23 09:51:28 compute-2 systemd[72559]: Stopped target Main User Target.
Jan 23 09:51:28 compute-2 systemd[72559]: Stopped target Basic System.
Jan 23 09:51:28 compute-2 systemd[72559]: Stopped target Paths.
Jan 23 09:51:28 compute-2 systemd[72559]: Stopped target Sockets.
Jan 23 09:51:28 compute-2 systemd[72559]: Stopped target Timers.
Jan 23 09:51:28 compute-2 systemd[72559]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:51:28 compute-2 systemd[72559]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:51:28 compute-2 systemd[72559]: Closed D-Bus User Message Bus Socket.
Jan 23 09:51:28 compute-2 systemd[72559]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:51:28 compute-2 systemd[72559]: Removed slice User Application Slice.
Jan 23 09:51:28 compute-2 systemd[72559]: Reached target Shutdown.
Jan 23 09:51:28 compute-2 systemd[72559]: Finished Exit the Session.
Jan 23 09:51:28 compute-2 systemd[72559]: Reached target Exit the Session.
Jan 23 09:51:28 compute-2 systemd[1]: user@42477.service: Deactivated successfully.
Jan 23 09:51:28 compute-2 systemd[1]: Stopped User Manager for UID 42477.
Jan 23 09:51:28 compute-2 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Jan 23 09:51:28 compute-2 systemd[1]: run-user-42477.mount: Deactivated successfully.
Jan 23 09:51:28 compute-2 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Jan 23 09:51:28 compute-2 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Jan 23 09:51:28 compute-2 systemd[1]: Removed slice User Slice of UID 42477.
Jan 23 09:51:28 compute-2 systemd[1]: user-42477.slice: Consumed 1min 28.092s CPU time.
Jan 23 09:51:28 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 09:51:28 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 09:51:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:28.602+0000 7f2757a28140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-2 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:51:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:28.874+0000 7f2757a28140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-2 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:51:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:28.973+0000 7f2757a28140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-2 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 09:51:29 compute-2 ceph-mon[75771]: 7.6 scrub starts
Jan 23 09:51:29 compute-2 ceph-mon[75771]: 7.6 scrub ok
Jan 23 09:51:29 compute-2 ceph-mon[75771]: 3.d deep-scrub starts
Jan 23 09:51:29 compute-2 ceph-mon[75771]: 3.d deep-scrub ok
Jan 23 09:51:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.069+0000 7f2757a28140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:51:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.152+0000 7f2757a28140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 09:51:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.235+0000 7f2757a28140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 09:51:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.605+0000 7f2757a28140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:51:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.719+0000 7f2757a28140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 09:51:29 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 09:51:30 compute-2 ceph-mon[75771]: 7.1e deep-scrub starts
Jan 23 09:51:30 compute-2 ceph-mon[75771]: 7.1e deep-scrub ok
Jan 23 09:51:30 compute-2 ceph-mon[75771]: 5.9 scrub starts
Jan 23 09:51:30 compute-2 ceph-mon[75771]: 5.9 scrub ok
Jan 23 09:51:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:30.258+0000 7f2757a28140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-2 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 09:51:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:30.943+0000 7f2757a28140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-2 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 09:51:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.029+0000 7f2757a28140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:51:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.133+0000 7f2757a28140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 09:51:31 compute-2 ceph-mon[75771]: 7.2 scrub starts
Jan 23 09:51:31 compute-2 ceph-mon[75771]: 7.2 scrub ok
Jan 23 09:51:31 compute-2 ceph-mon[75771]: Standby manager daemon compute-1.jmakme restarted
Jan 23 09:51:31 compute-2 ceph-mon[75771]: Standby manager daemon compute-1.jmakme started
Jan 23 09:51:31 compute-2 ceph-mon[75771]: 4.8 scrub starts
Jan 23 09:51:31 compute-2 ceph-mon[75771]: 4.8 scrub ok
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 09:51:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.315+0000 7f2757a28140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 09:51:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.394+0000 7f2757a28140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 09:51:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.571+0000 7f2757a28140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:51:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.815+0000 7f2757a28140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:31 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 09:51:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:32.129+0000 7f2757a28140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 09:51:32 compute-2 ceph-mon[75771]: 7.3 scrub starts
Jan 23 09:51:32 compute-2 ceph-mon[75771]: mgrmap e20: compute-0.nbdygh(active, starting, since 6s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:32 compute-2 ceph-mon[75771]: 7.3 scrub ok
Jan 23 09:51:32 compute-2 ceph-mon[75771]: 6.a deep-scrub starts
Jan 23 09:51:32 compute-2 ceph-mon[75771]: 6.a deep-scrub ok
Jan 23 09:51:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:32.223+0000 7f2757a28140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: mgr load Constructed class from module: dashboard
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: [dashboard INFO root] Starting engine...
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55cfcdde1860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 23 09:51:32 compute-2 ceph-mgr[76120]: [dashboard INFO root] Engine started...
Jan 23 09:51:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e34 e34: 2 total, 2 up, 2 in
Jan 23 09:51:33 compute-2 ceph-mon[75771]: 7.e scrub starts
Jan 23 09:51:33 compute-2 ceph-mon[75771]: 7.e scrub ok
Jan 23 09:51:33 compute-2 ceph-mon[75771]: Standby manager daemon compute-2.uczrot restarted
Jan 23 09:51:33 compute-2 ceph-mon[75771]: Standby manager daemon compute-2.uczrot started
Jan 23 09:51:33 compute-2 ceph-mon[75771]: 3.a scrub starts
Jan 23 09:51:33 compute-2 ceph-mon[75771]: 3.a scrub ok
Jan 23 09:51:33 compute-2 ceph-mon[75771]: Active manager daemon compute-0.nbdygh restarted
Jan 23 09:51:33 compute-2 ceph-mon[75771]: Activating manager daemon compute-0.nbdygh
Jan 23 09:51:33 compute-2 ceph-mon[75771]: osdmap e34: 2 total, 2 up, 2 in
Jan 23 09:51:33 compute-2 ceph-mon[75771]: mgrmap e21: compute-0.nbdygh(active, starting, since 0.0330109s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-0.nbdygh", "id": "compute-0.nbdygh"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-2.uczrot", "id": "compute-2.uczrot"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-1.jmakme", "id": "compute-1.jmakme"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: Manager daemon compute-0.nbdygh is now available
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 09:51:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 09:51:33 compute-2 sshd-session[77792]: Accepted publickey for ceph-admin from 192.168.122.100 port 35598 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:51:33 compute-2 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 09:51:33 compute-2 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 09:51:33 compute-2 systemd-logind[786]: New session 34 of user ceph-admin.
Jan 23 09:51:33 compute-2 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 09:51:33 compute-2 systemd[1]: Starting User Manager for UID 42477...
Jan 23 09:51:33 compute-2 systemd[77796]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:51:33 compute-2 systemd[77796]: Queued start job for default target Main User Target.
Jan 23 09:51:33 compute-2 systemd[77796]: Created slice User Application Slice.
Jan 23 09:51:33 compute-2 systemd[77796]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:51:33 compute-2 systemd[77796]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:51:33 compute-2 systemd[77796]: Reached target Paths.
Jan 23 09:51:33 compute-2 systemd[77796]: Reached target Timers.
Jan 23 09:51:33 compute-2 systemd[77796]: Starting D-Bus User Message Bus Socket...
Jan 23 09:51:33 compute-2 systemd[77796]: Starting Create User's Volatile Files and Directories...
Jan 23 09:51:33 compute-2 systemd[77796]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:51:33 compute-2 systemd[77796]: Reached target Sockets.
Jan 23 09:51:33 compute-2 systemd[77796]: Finished Create User's Volatile Files and Directories.
Jan 23 09:51:33 compute-2 systemd[77796]: Reached target Basic System.
Jan 23 09:51:33 compute-2 systemd[77796]: Reached target Main User Target.
Jan 23 09:51:33 compute-2 systemd[77796]: Startup finished in 127ms.
Jan 23 09:51:33 compute-2 systemd[1]: Started User Manager for UID 42477.
Jan 23 09:51:33 compute-2 systemd[1]: Started Session 34 of User ceph-admin.
Jan 23 09:51:33 compute-2 sshd-session[77792]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:51:33 compute-2 sudo[77812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:33 compute-2 sudo[77812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:33 compute-2 sudo[77812]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:33 compute-2 sudo[77837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:51:33 compute-2 sudo[77837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e2 new map
Jan 23 09:51:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e2 print_map
                                           e2
                                           btime 2026-01-23T09:51:34:000852+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:51:34.000760+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
Jan 23 09:51:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e35 e35: 2 total, 2 up, 2 in
Jan 23 09:51:34 compute-2 ceph-mon[75771]: 7.f deep-scrub starts
Jan 23 09:51:34 compute-2 ceph-mon[75771]: 7.f deep-scrub ok
Jan 23 09:51:34 compute-2 ceph-mon[75771]: 3.e deep-scrub starts
Jan 23 09:51:34 compute-2 ceph-mon[75771]: 3.e deep-scrub ok
Jan 23 09:51:34 compute-2 ceph-mon[75771]: mgrmap e22: compute-0.nbdygh(active, since 1.09481s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:34 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 23 09:51:34 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 23 09:51:34 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 23 09:51:34 compute-2 ceph-mon[75771]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 23 09:51:34 compute-2 ceph-mon[75771]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 23 09:51:34 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 23 09:51:34 compute-2 ceph-mon[75771]: osdmap e35: 2 total, 2 up, 2 in
Jan 23 09:51:34 compute-2 ceph-mon[75771]: fsmap cephfs:0
Jan 23 09:51:34 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:34 compute-2 podman[77933]: 2026-01-23 09:51:34.722271397 +0000 UTC m=+0.079571015 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:51:34 compute-2 podman[77933]: 2026-01-23 09:51:34.829184525 +0000 UTC m=+0.186484133 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:51:35 compute-2 sudo[77837]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:35 compute-2 sudo[78002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:35 compute-2 sudo[78002]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:35 compute-2 sudo[78002]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:35 compute-2 sudo[78027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:51:35 compute-2 sudo[78027]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:35 compute-2 ceph-mon[75771]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 09:51:35 compute-2 ceph-mon[75771]: 7.9 scrub starts
Jan 23 09:51:35 compute-2 ceph-mon[75771]: 7.9 scrub ok
Jan 23 09:51:35 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Bus STARTING
Jan 23 09:51:35 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Serving on http://192.168.122.100:8765
Jan 23 09:51:35 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Serving on https://192.168.122.100:7150
Jan 23 09:51:35 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Bus STARTED
Jan 23 09:51:35 compute-2 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Client ('192.168.122.100', 48072) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 09:51:35 compute-2 ceph-mon[75771]: 4.9 scrub starts
Jan 23 09:51:35 compute-2 ceph-mon[75771]: 4.9 scrub ok
Jan 23 09:51:35 compute-2 ceph-mon[75771]: pgmap v5: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:35 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-2 ceph-mon[75771]: from='client.14376 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:35 compute-2 ceph-mon[75771]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 09:51:35 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-2 sudo[78027]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:35 compute-2 sudo[78083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:35 compute-2 sudo[78083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:35 compute-2 sudo[78083]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:35 compute-2 sudo[78108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 09:51:35 compute-2 sudo[78108]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:36 compute-2 sudo[78108]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:36 compute-2 ceph-mon[75771]: 2.c scrub starts
Jan 23 09:51:36 compute-2 ceph-mon[75771]: 2.c scrub ok
Jan 23 09:51:36 compute-2 ceph-mon[75771]: 3.10 scrub starts
Jan 23 09:51:36 compute-2 ceph-mon[75771]: 3.10 scrub ok
Jan 23 09:51:36 compute-2 ceph-mon[75771]: mgrmap e23: compute-0.nbdygh(active, since 2s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Jan 23 09:51:36 compute-2 sudo[78152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:51:36 compute-2 sudo[78152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:36 compute-2 sudo[78152]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:51:37 compute-2 sudo[78177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78177]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:37 compute-2 sudo[78202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78202]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:37 compute-2 sudo[78227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78227]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:37 compute-2 sudo[78252]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78252]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e36 e36: 2 total, 2 up, 2 in
Jan 23 09:51:37 compute-2 sudo[78300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:37 compute-2 sudo[78300]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78300]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78325]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:37 compute-2 sudo[78325]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78325]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 23 09:51:37 compute-2 sudo[78350]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78350]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:37 compute-2 sudo[78375]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78375]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:37 compute-2 sudo[78400]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78400]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:37 compute-2 sudo[78425]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78425]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:37 compute-2 sudo[78450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78450]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:37 compute-2 sudo[78475]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78475]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 ceph-mon[75771]: 2.d scrub starts
Jan 23 09:51:37 compute-2 ceph-mon[75771]: 2.d scrub ok
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='client.14385 -' entity='client.admin' cmd=[{"prefix": "nfs cluster create", "cluster_id": "cephfs", "ingress": true, "virtual_ip": "192.168.122.2/24", "ingress_mode": "haproxy-protocol", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:37 compute-2 ceph-mon[75771]: 3.13 scrub starts
Jan 23 09:51:37 compute-2 ceph-mon[75771]: 3.13 scrub ok
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:51:37 compute-2 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 09:51:37 compute-2 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 09:51:37 compute-2 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 09:51:37 compute-2 ceph-mon[75771]: pgmap v6: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Jan 23 09:51:37 compute-2 ceph-mon[75771]: osdmap e36: 2 total, 2 up, 2 in
Jan 23 09:51:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Jan 23 09:51:37 compute-2 sudo[78523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:37 compute-2 sudo[78523]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78523]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:37 compute-2 sudo[78548]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-2 sudo[78548]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-2 sudo[78573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:37 compute-2 sudo[78573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78573]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:51:38 compute-2 sudo[78598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78598]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:51:38 compute-2 sudo[78623]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78623]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-2 sudo[78648]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78648]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:38 compute-2 sudo[78673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78673]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e37 e37: 2 total, 2 up, 2 in
Jan 23 09:51:38 compute-2 sudo[78698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-2 sudo[78698]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78698]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-2 sudo[78746]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78746]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-2 sudo[78771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78771]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:38 compute-2 sudo[78796]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78796]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:38 compute-2 sudo[78821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78821]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:38 compute-2 sudo[78846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78846]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-2 sudo[78871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78871]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:38 compute-2 sudo[78896]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78896]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-2 sudo[78921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78921]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-2 sudo[78969]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78969]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-2 sudo[78994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-2 sudo[78994]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-2 sudo[78994]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-2 sudo[79019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:39 compute-2 ceph-mon[75771]: 2.e scrub starts
Jan 23 09:51:39 compute-2 ceph-mon[75771]: 2.e scrub ok
Jan 23 09:51:39 compute-2 sudo[79019]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-2 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:39 compute-2 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:39 compute-2 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:39 compute-2 ceph-mon[75771]: 3.f scrub starts
Jan 23 09:51:39 compute-2 ceph-mon[75771]: 3.f scrub ok
Jan 23 09:51:39 compute-2 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:39 compute-2 ceph-mon[75771]: mgrmap e24: compute-0.nbdygh(active, since 5s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:39 compute-2 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:39 compute-2 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Jan 23 09:51:39 compute-2 ceph-mon[75771]: osdmap e37: 2 total, 2 up, 2 in
Jan 23 09:51:39 compute-2 ceph-mon[75771]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 09:51:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:39 compute-2 ceph-mon[75771]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 09:51:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:39 compute-2 sudo[79019]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e38 e38: 2 total, 2 up, 2 in
Jan 23 09:51:39 compute-2 sudo[79044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:39 compute-2 sudo[79044]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-2 sudo[79044]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-2 sudo[79069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/prometheus/node-exporter:v1.7.0 --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:39 compute-2 sudo[79069]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:40 compute-2 ceph-mon[75771]: 2.10 scrub starts
Jan 23 09:51:40 compute-2 ceph-mon[75771]: 2.10 scrub ok
Jan 23 09:51:40 compute-2 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:40 compute-2 ceph-mon[75771]: 4.15 scrub starts
Jan 23 09:51:40 compute-2 ceph-mon[75771]: 4.15 scrub ok
Jan 23 09:51:40 compute-2 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:40 compute-2 ceph-mon[75771]: pgmap v9: 194 pgs: 1 unknown, 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:40 compute-2 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-2 ceph-mon[75771]: 2.13 scrub starts
Jan 23 09:51:40 compute-2 ceph-mon[75771]: 2.13 scrub ok
Jan 23 09:51:40 compute-2 ceph-mon[75771]: osdmap e38: 2 total, 2 up, 2 in
Jan 23 09:51:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-2 ceph-mon[75771]: mgrmap e25: compute-0.nbdygh(active, since 6s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-2 ceph-mon[75771]: 5.15 scrub starts
Jan 23 09:51:40 compute-2 ceph-mon[75771]: 5.15 scrub ok
Jan 23 09:51:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-2 systemd[1]: Reloading.
Jan 23 09:51:40 compute-2 systemd-sysv-generator[79166]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:51:40 compute-2 systemd-rc-local-generator[79162]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:51:40 compute-2 systemd[1]: Reloading.
Jan 23 09:51:40 compute-2 systemd-rc-local-generator[79202]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:51:40 compute-2 systemd-sysv-generator[79206]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:51:40 compute-2 systemd[1]: Starting Ceph node-exporter.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:51:40 compute-2 bash[79260]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Jan 23 09:51:41 compute-2 ceph-mon[75771]: Deploying daemon node-exporter.compute-2 on compute-2
Jan 23 09:51:41 compute-2 ceph-mon[75771]: 2.15 scrub starts
Jan 23 09:51:41 compute-2 ceph-mon[75771]: 2.15 scrub ok
Jan 23 09:51:41 compute-2 ceph-mon[75771]: 5.16 scrub starts
Jan 23 09:51:41 compute-2 ceph-mon[75771]: 5.16 scrub ok
Jan 23 09:51:41 compute-2 bash[79260]: Getting image source signatures
Jan 23 09:51:41 compute-2 bash[79260]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Jan 23 09:51:41 compute-2 bash[79260]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Jan 23 09:51:41 compute-2 bash[79260]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Jan 23 09:51:43 compute-2 ceph-mon[75771]: pgmap v11: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Jan 23 09:51:43 compute-2 ceph-mon[75771]: 2.19 scrub starts
Jan 23 09:51:43 compute-2 ceph-mon[75771]: 2.19 scrub ok
Jan 23 09:51:43 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/992291970' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 23 09:51:43 compute-2 ceph-mon[75771]: 4.13 scrub starts
Jan 23 09:51:43 compute-2 ceph-mon[75771]: 4.13 scrub ok
Jan 23 09:51:43 compute-2 bash[79260]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Jan 23 09:51:43 compute-2 bash[79260]: Writing manifest to image destination
Jan 23 09:51:43 compute-2 podman[79260]: 2026-01-23 09:51:43.983179142 +0000 UTC m=+3.122504866 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Jan 23 09:51:44 compute-2 podman[79260]: 2026-01-23 09:51:44.147557358 +0000 UTC m=+3.286883062 container create 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:51:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47d514d8f55bda982f888d0e7f03ebab6be03e078204b04f7e76c86d45f56d75/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:44 compute-2 podman[79260]: 2026-01-23 09:51:44.237366071 +0000 UTC m=+3.376691785 container init 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:51:44 compute-2 podman[79260]: 2026-01-23 09:51:44.242201258 +0000 UTC m=+3.381526952 container start 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:51:44 compute-2 bash[79260]: 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0
Jan 23 09:51:44 compute-2 systemd[1]: Started Ceph node-exporter.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.270Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.270Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.270Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.270Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.271Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.271Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=arp
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=bcache
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=bonding
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=cpu
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=dmi
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=edac
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=entropy
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=filefd
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=hwmon
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=netclass
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=netdev
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=netstat
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=nfs
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=nvme
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=os
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=pressure
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=rapl
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=selinux
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=softnet
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=stat
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=textfile
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=thermal_zone
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=time
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=uname
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=xfs
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=zfs
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.273Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Jan 23 09:51:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.273Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Jan 23 09:51:44 compute-2 sudo[79069]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:44 compute-2 ceph-mon[75771]: 7.4 scrub starts
Jan 23 09:51:44 compute-2 ceph-mon[75771]: 7.4 scrub ok
Jan 23 09:51:44 compute-2 ceph-mon[75771]: 5.11 scrub starts
Jan 23 09:51:44 compute-2 ceph-mon[75771]: 5.11 scrub ok
Jan 23 09:51:44 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/992291970' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 23 09:51:44 compute-2 ceph-mon[75771]: pgmap v12: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Jan 23 09:51:44 compute-2 ceph-mon[75771]: 7.8 scrub starts
Jan 23 09:51:44 compute-2 ceph-mon[75771]: 7.8 scrub ok
Jan 23 09:51:44 compute-2 ceph-mon[75771]: 3.14 scrub starts
Jan 23 09:51:44 compute-2 ceph-mon[75771]: 3.14 scrub ok
Jan 23 09:51:44 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:44 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1904452043' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Jan 23 09:51:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:45 compute-2 sudo[79344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:45 compute-2 sudo[79344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:45 compute-2 sudo[79344]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:45 compute-2 sudo[79369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Jan 23 09:51:45 compute-2 sudo[79369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:45 compute-2 podman[79432]: 2026-01-23 09:51:45.845432886 +0000 UTC m=+0.038935176 container create be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:51:45 compute-2 systemd[1]: Started libpod-conmon-be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6.scope.
Jan 23 09:51:45 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:51:45 compute-2 podman[79432]: 2026-01-23 09:51:45.912305092 +0000 UTC m=+0.105807402 container init be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:51:45 compute-2 podman[79432]: 2026-01-23 09:51:45.918478183 +0000 UTC m=+0.111980473 container start be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:51:45 compute-2 friendly_hamilton[79448]: 167 167
Jan 23 09:51:45 compute-2 systemd[1]: libpod-be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6.scope: Deactivated successfully.
Jan 23 09:51:45 compute-2 podman[79432]: 2026-01-23 09:51:45.924494918 +0000 UTC m=+0.117997238 container attach be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 09:51:45 compute-2 podman[79432]: 2026-01-23 09:51:45.828024754 +0000 UTC m=+0.021527064 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:51:45 compute-2 podman[79432]: 2026-01-23 09:51:45.924774955 +0000 UTC m=+0.118277265 container died be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 23 09:51:45 compute-2 systemd[1]: var-lib-containers-storage-overlay-d4ba7fbffb29616fb715adf85dba1eb366bec505e99f037e54b2485607177553-merged.mount: Deactivated successfully.
Jan 23 09:51:45 compute-2 podman[79432]: 2026-01-23 09:51:45.968737403 +0000 UTC m=+0.162239683 container remove be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 09:51:45 compute-2 systemd[1]: libpod-conmon-be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6.scope: Deactivated successfully.
Jan 23 09:51:46 compute-2 podman[79471]: 2026-01-23 09:51:46.124803827 +0000 UTC m=+0.040137007 container create 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:51:46 compute-2 systemd[1]: Started libpod-conmon-589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f.scope.
Jan 23 09:51:46 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:51:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:46 compute-2 podman[79471]: 2026-01-23 09:51:46.193307642 +0000 UTC m=+0.108640842 container init 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:51:46 compute-2 podman[79471]: 2026-01-23 09:51:46.199903713 +0000 UTC m=+0.115236893 container start 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:51:46 compute-2 podman[79471]: 2026-01-23 09:51:46.110162851 +0000 UTC m=+0.025496051 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:51:46 compute-2 podman[79471]: 2026-01-23 09:51:46.207531688 +0000 UTC m=+0.122864868 container attach 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 23 09:51:46 compute-2 ceph-mon[75771]: 7.a scrub starts
Jan 23 09:51:46 compute-2 ceph-mon[75771]: 7.a scrub ok
Jan 23 09:51:46 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:46 compute-2 ceph-mon[75771]: 5.1f deep-scrub starts
Jan 23 09:51:46 compute-2 ceph-mon[75771]: 5.1f deep-scrub ok
Jan 23 09:51:46 compute-2 ceph-mon[75771]: pgmap v13: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 0 B/s wr, 9 op/s
Jan 23 09:51:46 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:46 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:46 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:51:46 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:51:46 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:46 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:51:46 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:46 compute-2 trusting_bose[79487]: --> passed data devices: 0 physical, 1 LVM
Jan 23 09:51:46 compute-2 trusting_bose[79487]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:46 compute-2 trusting_bose[79487]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:46 compute-2 trusting_bose[79487]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 2edb8fa1-89ea-44cd-9b6e-9f4d89095397
Jan 23 09:51:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"} v 0)
Jan 23 09:51:47 compute-2 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1205331151' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 09:51:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e39 e39: 3 total, 2 up, 3 in
Jan 23 09:51:48 compute-2 trusting_bose[79487]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Jan 23 09:51:48 compute-2 trusting_bose[79487]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 23 09:51:48 compute-2 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 09:51:48 compute-2 trusting_bose[79487]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:48 compute-2 lvm[79552]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:51:48 compute-2 lvm[79552]: VG ceph_vg0 finished
Jan 23 09:51:48 compute-2 trusting_bose[79487]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Jan 23 09:51:48 compute-2 ceph-mon[75771]: 7.14 deep-scrub starts
Jan 23 09:51:48 compute-2 ceph-mon[75771]: 7.14 deep-scrub ok
Jan 23 09:51:48 compute-2 ceph-mon[75771]: 5.10 scrub starts
Jan 23 09:51:48 compute-2 ceph-mon[75771]: 5.10 scrub ok
Jan 23 09:51:48 compute-2 ceph-mon[75771]: 7.b scrub starts
Jan 23 09:51:48 compute-2 ceph-mon[75771]: 7.b scrub ok
Jan 23 09:51:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/985471869' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 09:51:48 compute-2 ceph-mon[75771]: 6.1e deep-scrub starts
Jan 23 09:51:48 compute-2 ceph-mon[75771]: 6.1e deep-scrub ok
Jan 23 09:51:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1205331151' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 09:51:48 compute-2 ceph-mon[75771]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 09:51:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0)
Jan 23 09:51:48 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3212942412' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 23 09:51:48 compute-2 trusting_bose[79487]:  stderr: got monmap epoch 3
Jan 23 09:51:48 compute-2 trusting_bose[79487]: --> Creating keyring file for osd.2
Jan 23 09:51:48 compute-2 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Jan 23 09:51:48 compute-2 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Jan 23 09:51:48 compute-2 trusting_bose[79487]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 2edb8fa1-89ea-44cd-9b6e-9f4d89095397 --setuser ceph --setgroup ceph
Jan 23 09:51:49 compute-2 ceph-mon[75771]: pgmap v14: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 0 B/s wr, 8 op/s
Jan 23 09:51:49 compute-2 ceph-mon[75771]: 7.10 scrub starts
Jan 23 09:51:49 compute-2 ceph-mon[75771]: 7.10 scrub ok
Jan 23 09:51:49 compute-2 ceph-mon[75771]: 6.1c scrub starts
Jan 23 09:51:49 compute-2 ceph-mon[75771]: 6.1c scrub ok
Jan 23 09:51:49 compute-2 ceph-mon[75771]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]': finished
Jan 23 09:51:49 compute-2 ceph-mon[75771]: osdmap e39: 3 total, 2 up, 3 in
Jan 23 09:51:49 compute-2 ceph-mon[75771]: 7.13 scrub starts
Jan 23 09:51:49 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:51:49 compute-2 ceph-mon[75771]: 7.13 scrub ok
Jan 23 09:51:49 compute-2 ceph-mon[75771]: 6.12 scrub starts
Jan 23 09:51:49 compute-2 ceph-mon[75771]: 6.12 scrub ok
Jan 23 09:51:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3212942412' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 23 09:51:49 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:49 compute-2 ceph-mon[75771]: pgmap v16: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3560526778' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 23 09:51:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:51 compute-2 ceph-mon[75771]: 7.1d scrub starts
Jan 23 09:51:51 compute-2 ceph-mon[75771]: 7.1d scrub ok
Jan 23 09:51:51 compute-2 ceph-mon[75771]: 6.17 scrub starts
Jan 23 09:51:51 compute-2 ceph-mon[75771]: 6.17 scrub ok
Jan 23 09:51:52 compute-2 ceph-mon[75771]: 6.15 scrub starts
Jan 23 09:51:52 compute-2 ceph-mon[75771]: 6.15 scrub ok
Jan 23 09:51:52 compute-2 ceph-mon[75771]: pgmap v17: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:53 compute-2 ceph-mon[75771]: from='client.14424 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 23 09:51:53 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:53 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:53 compute-2 trusting_bose[79487]:  stderr: 2026-01-23T09:51:48.820+0000 7f0326444740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Jan 23 09:51:53 compute-2 trusting_bose[79487]:  stderr: 2026-01-23T09:51:49.082+0000 7f0326444740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Jan 23 09:51:53 compute-2 trusting_bose[79487]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 23 09:51:53 compute-2 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 09:51:53 compute-2 trusting_bose[79487]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 23 09:51:53 compute-2 trusting_bose[79487]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:53 compute-2 trusting_bose[79487]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:53 compute-2 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 09:51:53 compute-2 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 09:51:53 compute-2 trusting_bose[79487]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 23 09:51:53 compute-2 trusting_bose[79487]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 23 09:51:53 compute-2 systemd[1]: libpod-589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f.scope: Deactivated successfully.
Jan 23 09:51:53 compute-2 podman[79471]: 2026-01-23 09:51:53.638161115 +0000 UTC m=+7.553494295 container died 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 23 09:51:53 compute-2 systemd[1]: libpod-589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f.scope: Consumed 3.976s CPU time.
Jan 23 09:51:53 compute-2 systemd[1]: var-lib-containers-storage-overlay-cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154-merged.mount: Deactivated successfully.
Jan 23 09:51:53 compute-2 podman[79471]: 2026-01-23 09:51:53.720536508 +0000 UTC m=+7.635869688 container remove 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Jan 23 09:51:53 compute-2 systemd[1]: libpod-conmon-589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f.scope: Deactivated successfully.
Jan 23 09:51:53 compute-2 sudo[79369]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:53 compute-2 sudo[80473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:53 compute-2 sudo[80473]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:53 compute-2 sudo[80473]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:53 compute-2 sudo[80498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- lvm list --format json
Jan 23 09:51:53 compute-2 sudo[80498]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:54 compute-2 podman[80562]: 2026-01-23 09:51:54.279903823 +0000 UTC m=+0.042832181 container create 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 09:51:54 compute-2 systemd[1]: Started libpod-conmon-5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5.scope.
Jan 23 09:51:54 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:51:54 compute-2 podman[80562]: 2026-01-23 09:51:54.355322626 +0000 UTC m=+0.118251014 container init 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:51:54 compute-2 podman[80562]: 2026-01-23 09:51:54.260056631 +0000 UTC m=+0.022985019 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:51:54 compute-2 podman[80562]: 2026-01-23 09:51:54.363308991 +0000 UTC m=+0.126237349 container start 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Jan 23 09:51:54 compute-2 hardcore_tharp[80581]: 167 167
Jan 23 09:51:54 compute-2 systemd[1]: libpod-5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5.scope: Deactivated successfully.
Jan 23 09:51:54 compute-2 conmon[80581]: conmon 5b2080acc631c1cc0b8e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5.scope/container/memory.events
Jan 23 09:51:54 compute-2 podman[80562]: 2026-01-23 09:51:54.370951236 +0000 UTC m=+0.133879594 container attach 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 23 09:51:54 compute-2 podman[80562]: 2026-01-23 09:51:54.371311956 +0000 UTC m=+0.134240314 container died 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid)
Jan 23 09:51:54 compute-2 systemd[1]: var-lib-containers-storage-overlay-65f4ac93920b8b2fcc5dce3a3607ee4121ba998292d3a5b2d6e7e331d3bc65a0-merged.mount: Deactivated successfully.
Jan 23 09:51:54 compute-2 podman[80562]: 2026-01-23 09:51:54.439205855 +0000 UTC m=+0.202134213 container remove 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:51:54 compute-2 systemd[1]: libpod-conmon-5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5.scope: Deactivated successfully.
Jan 23 09:51:54 compute-2 ceph-mon[75771]: pgmap v18: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:54 compute-2 podman[80605]: 2026-01-23 09:51:54.614828745 +0000 UTC m=+0.048443629 container create 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 09:51:54 compute-2 systemd[1]: Started libpod-conmon-1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e.scope.
Jan 23 09:51:54 compute-2 podman[80605]: 2026-01-23 09:51:54.595150476 +0000 UTC m=+0.028765370 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:51:54 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:51:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:54 compute-2 podman[80605]: 2026-01-23 09:51:54.735228831 +0000 UTC m=+0.168843725 container init 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:51:54 compute-2 podman[80605]: 2026-01-23 09:51:54.744461395 +0000 UTC m=+0.178076269 container start 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default)
Jan 23 09:51:54 compute-2 podman[80605]: 2026-01-23 09:51:54.748988635 +0000 UTC m=+0.182603509 container attach 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:51:55 compute-2 great_clarke[80621]: {
Jan 23 09:51:55 compute-2 great_clarke[80621]:     "2": [
Jan 23 09:51:55 compute-2 great_clarke[80621]:         {
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "devices": [
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "/dev/loop3"
Jan 23 09:51:55 compute-2 great_clarke[80621]:             ],
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "lv_name": "ceph_lv0",
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "lv_size": "21470642176",
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=2wFOwd-HcwO-2lSY-8RBi-SMwa-NPkg-tiq3o8,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=f3005f84-239a-55b6-a948-8f1fb592b920,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=2edb8fa1-89ea-44cd-9b6e-9f4d89095397,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "lv_uuid": "2wFOwd-HcwO-2lSY-8RBi-SMwa-NPkg-tiq3o8",
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "name": "ceph_lv0",
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "tags": {
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.block_uuid": "2wFOwd-HcwO-2lSY-8RBi-SMwa-NPkg-tiq3o8",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.cephx_lockbox_secret": "",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.cluster_fsid": "f3005f84-239a-55b6-a948-8f1fb592b920",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.cluster_name": "ceph",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.crush_device_class": "",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.encrypted": "0",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.osd_fsid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.osd_id": "2",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.osdspec_affinity": "default_drive_group",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.type": "block",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.vdo": "0",
Jan 23 09:51:55 compute-2 great_clarke[80621]:                 "ceph.with_tpm": "0"
Jan 23 09:51:55 compute-2 great_clarke[80621]:             },
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "type": "block",
Jan 23 09:51:55 compute-2 great_clarke[80621]:             "vg_name": "ceph_vg0"
Jan 23 09:51:55 compute-2 great_clarke[80621]:         }
Jan 23 09:51:55 compute-2 great_clarke[80621]:     ]
Jan 23 09:51:55 compute-2 great_clarke[80621]: }
Jan 23 09:51:55 compute-2 systemd[1]: libpod-1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e.scope: Deactivated successfully.
Jan 23 09:51:55 compute-2 podman[80605]: 2026-01-23 09:51:55.110526391 +0000 UTC m=+0.544141275 container died 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:51:55 compute-2 systemd[1]: var-lib-containers-storage-overlay-e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5-merged.mount: Deactivated successfully.
Jan 23 09:51:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:55 compute-2 podman[80605]: 2026-01-23 09:51:55.165987978 +0000 UTC m=+0.599602852 container remove 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Jan 23 09:51:55 compute-2 systemd[1]: libpod-conmon-1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e.scope: Deactivated successfully.
Jan 23 09:51:55 compute-2 sudo[80498]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:55 compute-2 sudo[80641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:55 compute-2 sudo[80641]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:55 compute-2 sudo[80641]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:55 compute-2 sudo[80666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:55 compute-2 sudo[80666]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:55 compute-2 podman[80734]: 2026-01-23 09:51:55.780695695 +0000 UTC m=+0.045407369 container create 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:51:55 compute-2 systemd[1]: Started libpod-conmon-7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266.scope.
Jan 23 09:51:55 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:51:55 compute-2 podman[80734]: 2026-01-23 09:51:55.755824424 +0000 UTC m=+0.020536128 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:51:55 compute-2 podman[80734]: 2026-01-23 09:51:55.863493262 +0000 UTC m=+0.128204966 container init 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:51:55 compute-2 podman[80734]: 2026-01-23 09:51:55.871696257 +0000 UTC m=+0.136407941 container start 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:51:55 compute-2 podman[80734]: 2026-01-23 09:51:55.875425986 +0000 UTC m=+0.140137690 container attach 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 23 09:51:55 compute-2 brave_sanderson[80750]: 167 167
Jan 23 09:51:55 compute-2 systemd[1]: libpod-7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266.scope: Deactivated successfully.
Jan 23 09:51:55 compute-2 podman[80734]: 2026-01-23 09:51:55.879628226 +0000 UTC m=+0.144339900 container died 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 09:51:55 compute-2 systemd[1]: var-lib-containers-storage-overlay-3edd05407961d56181044db07bf2b7fd31c013f92930dfe7345aa8510c633b3e-merged.mount: Deactivated successfully.
Jan 23 09:51:56 compute-2 podman[80734]: 2026-01-23 09:51:56.01823654 +0000 UTC m=+0.282948214 container remove 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:51:56 compute-2 systemd[1]: libpod-conmon-7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266.scope: Deactivated successfully.
Jan 23 09:51:56 compute-2 podman[80781]: 2026-01-23 09:51:56.270103685 +0000 UTC m=+0.043677759 container create cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 23 09:51:56 compute-2 systemd[1]: Started libpod-conmon-cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514.scope.
Jan 23 09:51:56 compute-2 podman[80781]: 2026-01-23 09:51:56.253081691 +0000 UTC m=+0.026655765 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:51:56 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:51:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:56 compute-2 podman[80781]: 2026-01-23 09:51:56.385061997 +0000 UTC m=+0.158636101 container init cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 23 09:51:56 compute-2 podman[80781]: 2026-01-23 09:51:56.39237176 +0000 UTC m=+0.165945834 container start cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:51:56 compute-2 podman[80781]: 2026-01-23 09:51:56.413743228 +0000 UTC m=+0.187317302 container attach cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:51:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test[80797]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Jan 23 09:51:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test[80797]:                             [--no-systemd] [--no-tmpfs]
Jan 23 09:51:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test[80797]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 23 09:51:56 compute-2 systemd[1]: libpod-cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514.scope: Deactivated successfully.
Jan 23 09:51:56 compute-2 podman[80781]: 2026-01-23 09:51:56.575535203 +0000 UTC m=+0.349109287 container died cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:51:56 compute-2 ceph-mon[75771]: pgmap v19: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:56 compute-2 ceph-mon[75771]: from='client.14430 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 23 09:51:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 23 09:51:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:56 compute-2 ceph-mon[75771]: Deploying daemon osd.2 on compute-2
Jan 23 09:51:56 compute-2 systemd[1]: var-lib-containers-storage-overlay-70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90-merged.mount: Deactivated successfully.
Jan 23 09:51:56 compute-2 podman[80781]: 2026-01-23 09:51:56.615505342 +0000 UTC m=+0.389079426 container remove cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 23 09:51:56 compute-2 systemd[1]: libpod-conmon-cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514.scope: Deactivated successfully.
Jan 23 09:51:56 compute-2 systemd[1]: Reloading.
Jan 23 09:51:56 compute-2 systemd-rc-local-generator[80856]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:51:56 compute-2 systemd-sysv-generator[80860]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:51:57 compute-2 systemd[1]: Reloading.
Jan 23 09:51:57 compute-2 systemd-rc-local-generator[80893]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:51:57 compute-2 systemd-sysv-generator[80897]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:51:57 compute-2 systemd[1]: Starting Ceph osd.2 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:51:57 compute-2 podman[80955]: 2026-01-23 09:51:57.706816975 +0000 UTC m=+0.040776050 container create cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:51:57 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:51:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:57 compute-2 podman[80955]: 2026-01-23 09:51:57.687421583 +0000 UTC m=+0.021380698 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:51:57 compute-2 podman[80955]: 2026-01-23 09:51:57.800117862 +0000 UTC m=+0.134076947 container init cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 23 09:51:57 compute-2 podman[80955]: 2026-01-23 09:51:57.806226447 +0000 UTC m=+0.140185512 container start cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:51:57 compute-2 podman[80955]: 2026-01-23 09:51:57.809676018 +0000 UTC m=+0.143635123 container attach cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Jan 23 09:51:57 compute-2 ceph-mon[75771]: pgmap v20: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:57 compute-2 bash[80955]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:58 compute-2 bash[80955]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:58 compute-2 lvm[81051]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:51:58 compute-2 lvm[81051]: VG ceph_vg0 finished
Jan 23 09:51:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 23 09:51:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:58 compute-2 bash[80955]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 23 09:51:58 compute-2 bash[80955]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:58 compute-2 bash[80955]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:51:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 09:51:58 compute-2 bash[80955]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 09:51:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 23 09:51:58 compute-2 bash[80955]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 23 09:51:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:59 compute-2 bash[80955]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:59 compute-2 bash[80955]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 09:51:59 compute-2 bash[80955]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 09:51:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 09:51:59 compute-2 bash[80955]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 09:51:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 23 09:51:59 compute-2 bash[80955]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 23 09:51:59 compute-2 systemd[1]: libpod-cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642.scope: Deactivated successfully.
Jan 23 09:51:59 compute-2 podman[80955]: 2026-01-23 09:51:59.085489255 +0000 UTC m=+1.419448330 container died cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:51:59 compute-2 systemd[1]: libpod-cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642.scope: Consumed 1.359s CPU time.
Jan 23 09:51:59 compute-2 systemd[1]: var-lib-containers-storage-overlay-ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e-merged.mount: Deactivated successfully.
Jan 23 09:51:59 compute-2 podman[80955]: 2026-01-23 09:51:59.140363679 +0000 UTC m=+1.474322754 container remove cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:51:59 compute-2 podman[81211]: 2026-01-23 09:51:59.352491569 +0000 UTC m=+0.046417854 container create 2fe854da164488d1d6d6dc2938b8a7e9e3832f2f3f02511f769527c4b75bc72f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:51:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:59 compute-2 podman[81211]: 2026-01-23 09:51:59.40635481 +0000 UTC m=+0.100281095 container init 2fe854da164488d1d6d6dc2938b8a7e9e3832f2f3f02511f769527c4b75bc72f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Jan 23 09:51:59 compute-2 podman[81211]: 2026-01-23 09:51:59.41521809 +0000 UTC m=+0.109144375 container start 2fe854da164488d1d6d6dc2938b8a7e9e3832f2f3f02511f769527c4b75bc72f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 23 09:51:59 compute-2 bash[81211]: 2fe854da164488d1d6d6dc2938b8a7e9e3832f2f3f02511f769527c4b75bc72f
Jan 23 09:51:59 compute-2 podman[81211]: 2026-01-23 09:51:59.331280415 +0000 UTC m=+0.025206700 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:51:59 compute-2 systemd[1]: Started Ceph osd.2 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:51:59 compute-2 ceph-osd[81231]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:51:59 compute-2 ceph-osd[81231]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Jan 23 09:51:59 compute-2 ceph-osd[81231]: pidfile_write: ignore empty --pid-file
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:51:59 compute-2 sudo[80666]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:51:59 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:00 compute-2 ceph-osd[81231]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Jan 23 09:52:00 compute-2 ceph-osd[81231]: load: jerasure load: lrc 
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:00 compute-2 ceph-mon[75771]: pgmap v21: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 09:52:00 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:01 compute-2 ceph-osd[81231]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 23 09:52:01 compute-2 ceph-osd[81231]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:01 compute-2 sudo[81282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:52:01 compute-2 sudo[81282]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:01 compute-2 sudo[81282]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:01 compute-2 sudo[81307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- raw list --format json
Jan 23 09:52:01 compute-2 sudo[81307]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount shared_bdev_used = 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: RocksDB version: 7.9.2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Git sha 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: DB SUMMARY
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: DB Session ID:  BA6EM20ZQ6TLD6WYVFWO
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: CURRENT file:  CURRENT
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                         Options.error_if_exists: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.create_if_missing: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                                     Options.env: 0x559222c6e770
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                                Options.info_log: 0x559223a979e0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                              Options.statistics: (nil)
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.use_fsync: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                              Options.db_log_dir: 
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.write_buffer_manager: 0x559223b82a00
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.unordered_write: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.row_cache: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                              Options.wal_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.two_write_queues: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.wal_compression: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.atomic_flush: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.max_background_jobs: 4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.max_background_compactions: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.max_subcompactions: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.max_open_files: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Compression algorithms supported:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kZSTD supported: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kXpressCompression supported: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kBZip2Compression supported: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kZSTDNotFinalCompression supported: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kLZ4Compression supported: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kZlibCompression supported: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kLZ4HCCompression supported: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kSnappyCompression supported: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97dc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb09b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97dc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb09b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97dc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb09b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b695fe5f-810b-4432-8e0d-0ba463e0cde8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161921737516, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161921737828, "job": 1, "event": "recovery_finished"}
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: freelist init
Jan 23 09:52:01 compute-2 ceph-osd[81231]: freelist _read_cfg
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs umount
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluefs mount shared_bdev_used = 4718592
Jan 23 09:52:01 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: RocksDB version: 7.9.2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Git sha 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: DB SUMMARY
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: DB Session ID:  BA6EM20ZQ6TLD6WYVFWP
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: CURRENT file:  CURRENT
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                         Options.error_if_exists: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.create_if_missing: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                                     Options.env: 0x559222c6ed90
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                                Options.info_log: 0x559223c42760
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                              Options.statistics: (nil)
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.use_fsync: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                              Options.db_log_dir: 
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.write_buffer_manager: 0x559223b82a00
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.unordered_write: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.row_cache: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                              Options.wal_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.two_write_queues: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.wal_compression: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.atomic_flush: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.max_background_jobs: 4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.max_background_compactions: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.max_subcompactions: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.max_open_files: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Compression algorithms supported:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kZSTD supported: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kXpressCompression supported: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kBZip2Compression supported: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kZSTDNotFinalCompression supported: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kLZ4Compression supported: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kZlibCompression supported: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kLZ4HCCompression supported: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         kSnappyCompression supported: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:01 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb1350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97d00)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb09b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97d00)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb09b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97d00)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559222cb09b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b695fe5f-810b-4432-8e0d-0ba463e0cde8
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922007715, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922012650, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161922, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b695fe5f-810b-4432-8e0d-0ba463e0cde8", "db_session_id": "BA6EM20ZQ6TLD6WYVFWP", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922016477, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161922, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b695fe5f-810b-4432-8e0d-0ba463e0cde8", "db_session_id": "BA6EM20ZQ6TLD6WYVFWP", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922021452, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161922, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b695fe5f-810b-4432-8e0d-0ba463e0cde8", "db_session_id": "BA6EM20ZQ6TLD6WYVFWP", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922024567, "job": 1, "event": "recovery_finished"}
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x559223dee000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: DB pointer 0x559223dd0000
Jan 23 09:52:02 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 09:52:02 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Jan 23 09:52:02 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 09:52:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 09:52:02 compute-2 ceph-osd[81231]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 23 09:52:02 compute-2 ceph-osd[81231]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 23 09:52:02 compute-2 ceph-osd[81231]: _get_class not permitted to load lua
Jan 23 09:52:02 compute-2 ceph-osd[81231]: _get_class not permitted to load sdk
Jan 23 09:52:02 compute-2 ceph-osd[81231]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 23 09:52:02 compute-2 ceph-osd[81231]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 23 09:52:02 compute-2 ceph-osd[81231]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 23 09:52:02 compute-2 ceph-osd[81231]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 23 09:52:02 compute-2 ceph-osd[81231]: osd.2 0 load_pgs
Jan 23 09:52:02 compute-2 ceph-osd[81231]: osd.2 0 load_pgs opened 0 pgs
Jan 23 09:52:02 compute-2 ceph-osd[81231]: osd.2 0 log_to_monitors true
Jan 23 09:52:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2[81227]: 2026-01-23T09:52:02.056+0000 7f0b54478740 -1 osd.2 0 log_to_monitors true
Jan 23 09:52:02 compute-2 podman[81740]: 2026-01-23 09:52:02.067796802 +0000 UTC m=+0.043122186 container create dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:52:02 compute-2 podman[81740]: 2026-01-23 09:52:02.046369002 +0000 UTC m=+0.021694416 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:02 compute-2 systemd[1]: Started libpod-conmon-dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69.scope.
Jan 23 09:52:02 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:52:02 compute-2 podman[81740]: 2026-01-23 09:52:02.359593315 +0000 UTC m=+0.334918729 container init dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:52:02 compute-2 podman[81740]: 2026-01-23 09:52:02.367057742 +0000 UTC m=+0.342383126 container start dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 09:52:02 compute-2 podman[81740]: 2026-01-23 09:52:02.371463967 +0000 UTC m=+0.346789371 container attach dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:52:02 compute-2 jolly_meninsky[81789]: 167 167
Jan 23 09:52:02 compute-2 systemd[1]: libpod-dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69.scope: Deactivated successfully.
Jan 23 09:52:02 compute-2 podman[81740]: 2026-01-23 09:52:02.372615244 +0000 UTC m=+0.347940628 container died dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:52:02 compute-2 systemd[1]: var-lib-containers-storage-overlay-250279687aeb2cd740afdfbb55d22bf5e55e159c702a5771f0b6df12a1c879b1-merged.mount: Deactivated successfully.
Jan 23 09:52:02 compute-2 podman[81740]: 2026-01-23 09:52:02.442777651 +0000 UTC m=+0.418103035 container remove dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 23 09:52:02 compute-2 systemd[1]: libpod-conmon-dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69.scope: Deactivated successfully.
Jan 23 09:52:02 compute-2 ceph-mon[75771]: from='client.14439 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 23 09:52:02 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:02 compute-2 ceph-mon[75771]: pgmap v22: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:02 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:02 compute-2 podman[81815]: 2026-01-23 09:52:02.592042509 +0000 UTC m=+0.039637033 container create 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Jan 23 09:52:02 compute-2 systemd[1]: Started libpod-conmon-5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a.scope.
Jan 23 09:52:02 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:52:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:02 compute-2 podman[81815]: 2026-01-23 09:52:02.574157163 +0000 UTC m=+0.021751607 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:02 compute-2 podman[81815]: 2026-01-23 09:52:02.671595399 +0000 UTC m=+0.119189843 container init 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Jan 23 09:52:02 compute-2 podman[81815]: 2026-01-23 09:52:02.684329002 +0000 UTC m=+0.131923426 container start 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:52:02 compute-2 podman[81815]: 2026-01-23 09:52:02.730196332 +0000 UTC m=+0.177790776 container attach 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 23 09:52:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Jan 23 09:52:02 compute-2 ceph-mon[75771]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 09:52:03 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 23 09:52:03 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 23 09:52:03 compute-2 lvm[81904]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:52:03 compute-2 lvm[81904]: VG ceph_vg0 finished
Jan 23 09:52:03 compute-2 naughty_booth[81831]: {}
Jan 23 09:52:03 compute-2 systemd[1]: libpod-5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a.scope: Deactivated successfully.
Jan 23 09:52:03 compute-2 systemd[1]: libpod-5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a.scope: Consumed 1.107s CPU time.
Jan 23 09:52:03 compute-2 podman[81815]: 2026-01-23 09:52:03.411798667 +0000 UTC m=+0.859393111 container died 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Jan 23 09:52:03 compute-2 systemd[1]: var-lib-containers-storage-overlay-cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113-merged.mount: Deactivated successfully.
Jan 23 09:52:03 compute-2 podman[81815]: 2026-01-23 09:52:03.54484295 +0000 UTC m=+0.992437374 container remove 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:52:03 compute-2 systemd[1]: libpod-conmon-5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a.scope: Deactivated successfully.
Jan 23 09:52:03 compute-2 sudo[81307]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:03 compute-2 ceph-mon[75771]: from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 09:52:03 compute-2 ceph-mon[75771]: pgmap v23: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:03 compute-2 ceph-mon[75771]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 09:52:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e40 e40: 3 total, 2 up, 3 in
Jan 23 09:52:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]} v 0)
Jan 23 09:52:04 compute-2 ceph-mon[75771]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 09:52:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e41 e41: 3 total, 2 up, 3 in
Jan 23 09:52:07 compute-2 ceph-mon[75771]: purged_snaps scrub starts
Jan 23 09:52:07 compute-2 ceph-mon[75771]: purged_snaps scrub ok
Jan 23 09:52:07 compute-2 ceph-mon[75771]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 23 09:52:07 compute-2 ceph-mon[75771]: from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 09:52:07 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:07 compute-2 ceph-mon[75771]: osdmap e40: 3 total, 2 up, 3 in
Jan 23 09:52:07 compute-2 ceph-mon[75771]: pgmap v25: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:07 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:07 compute-2 ceph-mon[75771]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 09:52:07 compute-2 ceph-osd[81231]: osd.2 0 done with init, starting boot process
Jan 23 09:52:07 compute-2 ceph-osd[81231]: osd.2 0 start_boot
Jan 23 09:52:07 compute-2 ceph-osd[81231]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 23 09:52:07 compute-2 ceph-osd[81231]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 23 09:52:07 compute-2 ceph-osd[81231]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 23 09:52:07 compute-2 ceph-osd[81231]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 23 09:52:07 compute-2 ceph-osd[81231]: osd.2 0  bench count 12288000 bsize 4 KiB
Jan 23 09:52:07 compute-2 sudo[81921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:52:07 compute-2 sudo[81921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:07 compute-2 sudo[81921]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:07 compute-2 sudo[81946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:52:07 compute-2 sudo[81946]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:07 compute-2 podman[82009]: 2026-01-23 09:52:07.854793444 +0000 UTC m=+0.039866969 container create bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:52:07 compute-2 systemd[1]: Started libpod-conmon-bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586.scope.
Jan 23 09:52:07 compute-2 podman[82009]: 2026-01-23 09:52:07.837273087 +0000 UTC m=+0.022346642 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:07 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:52:07 compute-2 podman[82009]: 2026-01-23 09:52:07.989225338 +0000 UTC m=+0.174298863 container init bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 23 09:52:07 compute-2 podman[82009]: 2026-01-23 09:52:07.996802178 +0000 UTC m=+0.181875703 container start bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 09:52:08 compute-2 great_maxwell[82024]: 167 167
Jan 23 09:52:08 compute-2 systemd[1]: libpod-bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586.scope: Deactivated successfully.
Jan 23 09:52:08 compute-2 podman[82009]: 2026-01-23 09:52:08.017791517 +0000 UTC m=+0.202865042 container attach bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:52:08 compute-2 podman[82009]: 2026-01-23 09:52:08.01918445 +0000 UTC m=+0.204257975 container died bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Jan 23 09:52:08 compute-2 systemd[1]: var-lib-containers-storage-overlay-78fa74605695d3d171647444fa256abe097174f2b854c46c3f6d5605141fb40f-merged.mount: Deactivated successfully.
Jan 23 09:52:08 compute-2 podman[82009]: 2026-01-23 09:52:08.172238166 +0000 UTC m=+0.357311691 container remove bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 09:52:08 compute-2 systemd[1]: libpod-conmon-bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586.scope: Deactivated successfully.
Jan 23 09:52:08 compute-2 systemd[1]: Reloading.
Jan 23 09:52:08 compute-2 systemd-rc-local-generator[82073]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:08 compute-2 systemd-sysv-generator[82077]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:08 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:08 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.yzflfx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:08 compute-2 ceph-mon[75771]: pgmap v26: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:08 compute-2 ceph-mon[75771]: from='client.14445 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 23 09:52:08 compute-2 ceph-mon[75771]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Jan 23 09:52:08 compute-2 ceph-mon[75771]: osdmap e41: 3 total, 2 up, 3 in
Jan 23 09:52:08 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:08 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.yzflfx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:08 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:08 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:08 compute-2 ceph-mon[75771]: Deploying daemon rgw.rgw.compute-2.yzflfx on compute-2
Jan 23 09:52:08 compute-2 systemd[1]: Reloading.
Jan 23 09:52:08 compute-2 systemd-rc-local-generator[82112]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:08 compute-2 systemd-sysv-generator[82115]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:09 compute-2 systemd[1]: Starting Ceph rgw.rgw.compute-2.yzflfx for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:52:09 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:09 compute-2 ceph-mon[75771]: pgmap v28: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:09 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:10 compute-2 podman[82166]: 2026-01-23 09:52:10.040330037 +0000 UTC m=+0.067842353 container create acdcd7af422f7f2705bf22df75cd33bfe41cd9d5b73262aeba6616edae8c78b9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-2-yzflfx, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:52:10 compute-2 podman[82166]: 2026-01-23 09:52:09.997316295 +0000 UTC m=+0.024828611 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97256027b59122fa8a8bc9321d1cf77dbcaa9b23d28b00f4296d4f441bda5c05/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97256027b59122fa8a8bc9321d1cf77dbcaa9b23d28b00f4296d4f441bda5c05/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97256027b59122fa8a8bc9321d1cf77dbcaa9b23d28b00f4296d4f441bda5c05/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:10 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97256027b59122fa8a8bc9321d1cf77dbcaa9b23d28b00f4296d4f441bda5c05/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.yzflfx supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:10 compute-2 podman[82166]: 2026-01-23 09:52:10.178450059 +0000 UTC m=+0.205962405 container init acdcd7af422f7f2705bf22df75cd33bfe41cd9d5b73262aeba6616edae8c78b9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-2-yzflfx, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:52:10 compute-2 podman[82166]: 2026-01-23 09:52:10.188741294 +0000 UTC m=+0.216253610 container start acdcd7af422f7f2705bf22df75cd33bfe41cd9d5b73262aeba6616edae8c78b9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-2-yzflfx, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:52:10 compute-2 bash[82166]: acdcd7af422f7f2705bf22df75cd33bfe41cd9d5b73262aeba6616edae8c78b9
Jan 23 09:52:10 compute-2 systemd[1]: Started Ceph rgw.rgw.compute-2.yzflfx for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:52:10 compute-2 radosgw[82185]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:52:10 compute-2 radosgw[82185]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Jan 23 09:52:10 compute-2 radosgw[82185]: framework: beast
Jan 23 09:52:10 compute-2 radosgw[82185]: framework conf key: endpoint, val: 192.168.122.102:8082
Jan 23 09:52:10 compute-2 radosgw[82185]: init_numa not setting numa affinity
Jan 23 09:52:10 compute-2 sudo[81946]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/237302038' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Jan 23 09:52:11 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:11 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:11 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:11 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:11 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.syfcuk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:11 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.syfcuk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:11 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:11 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e42 e42: 3 total, 2 up, 3 in
Jan 23 09:52:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Jan 23 09:52:15 compute-2 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2692084146' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 09:52:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:15 compute-2 ceph-mon[75771]: Deploying daemon rgw.rgw.compute-1.syfcuk on compute-1
Jan 23 09:52:15 compute-2 ceph-mon[75771]: pgmap v29: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:15 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:16 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e43 e43: 3 total, 2 up, 3 in
Jan 23 09:52:17 compute-2 ceph-osd[81231]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 8.114 iops: 2077.197 elapsed_sec: 1.444
Jan 23 09:52:17 compute-2 ceph-osd[81231]: log_channel(cluster) log [WRN] : OSD bench result of 2077.197482 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 09:52:17 compute-2 ceph-osd[81231]: osd.2 0 waiting for initial osdmap
Jan 23 09:52:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2[81227]: 2026-01-23T09:52:17.437+0000 7f0b50c0e640 -1 osd.2 0 waiting for initial osdmap
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2988268721' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Jan 23 09:52:17 compute-2 ceph-mon[75771]: pgmap v30: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-2 ceph-mon[75771]: osdmap e42: 3 total, 2 up, 3 in
Jan 23 09:52:17 compute-2 ceph-mon[75771]: pgmap v32: 195 pgs: 1 unknown, 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2692084146' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 23 09:52:17 compute-2 ceph-mon[75771]: osdmap e43: 3 total, 2 up, 3 in
Jan 23 09:52:17 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-2 ceph-osd[81231]: osd.2 40 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 23 09:52:17 compute-2 ceph-osd[81231]: osd.2 40 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 23 09:52:17 compute-2 ceph-osd[81231]: osd.2 40 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 23 09:52:17 compute-2 ceph-osd[81231]: osd.2 40 check_osdmap_features require_osd_release unknown -> squid
Jan 23 09:52:17 compute-2 ceph-osd[81231]: osd.2 43 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 09:52:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2[81227]: 2026-01-23T09:52:17.484+0000 7f0b4ba23640 -1 osd.2 43 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 09:52:17 compute-2 ceph-osd[81231]: osd.2 43 set_numa_affinity not setting numa affinity
Jan 23 09:52:17 compute-2 ceph-osd[81231]: osd.2 43 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Jan 23 09:52:18 compute-2 ceph-osd[81231]: osd.2 43 tick checking mon for new map
Jan 23 09:52:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e44 e44: 3 total, 2 up, 3 in
Jan 23 09:52:19 compute-2 ceph-mon[75771]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 09:52:19 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:19 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jbpfwf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:19 compute-2 ceph-mon[75771]: pgmap v34: 195 pgs: 1 unknown, 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:19 compute-2 ceph-mon[75771]: OSD bench result of 2077.197482 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 09:52:19 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:19 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jbpfwf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1421940163' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Jan 23 09:52:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 23 09:52:19 compute-2 ceph-osd[81231]: osd.2 45 state: booting -> active
Jan 23 09:52:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.19( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[6.1b( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.1c( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[3.8( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.1b( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[3.1b( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.6( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.2( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[3.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[6.1( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.a( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.d( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.d( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.1d( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.c( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[7.a( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.b( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.8( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.10( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[7.14( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.14( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.13( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.15( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.12( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.13( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[7.1d( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:20 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.3( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 23 09:52:21 compute-2 ceph-mon[75771]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 09:52:21 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:21 compute-2 ceph-mon[75771]: pgmap v35: 195 pgs: 195 active+clean; 450 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 511 B/s wr, 1 op/s
Jan 23 09:52:21 compute-2 ceph-mon[75771]: osdmap e44: 3 total, 2 up, 3 in
Jan 23 09:52:21 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:21 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:21 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:21 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:21 compute-2 ceph-mon[75771]: Deploying daemon rgw.rgw.compute-0.jbpfwf on compute-0
Jan 23 09:52:21 compute-2 ceph-mon[75771]: osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776] boot
Jan 23 09:52:21 compute-2 ceph-mon[75771]: osdmap e45: 3 total, 3 up, 3 in
Jan 23 09:52:21 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.1d( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.14( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.b( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.10( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.14( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.8( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.c( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.0( empty local-lis/les=45/46 n=0 ec=14/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.19( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1b( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1b( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.6( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.3( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.2( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.0( empty local-lis/les=45/46 n=0 ec=16/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.d( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1d( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Jan 23 09:52:22 compute-2 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-2 ceph-mon[75771]: pgmap v38: 195 pgs: 28 peering, 167 active+clean; 450 KiB data, 480 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 682 B/s wr, 1 op/s
Jan 23 09:52:22 compute-2 ceph-mon[75771]: osdmap e46: 3 total, 3 up, 3 in
Jan 23 09:52:22 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1f( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1e( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.12( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.16( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.12( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.15( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1a( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.e( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.1f( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1c( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1f( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1e( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.12( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.17( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.15( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.16( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.9( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.12( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.8( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.9( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1a( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.e( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.15( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 23 09:52:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1010663506' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Jan 23 09:52:24 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 09:52:24 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 09:52:24 compute-2 ceph-mon[75771]: osdmap e47: 3 total, 3 up, 3 in
Jan 23 09:52:24 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:24 compute-2 ceph-mon[75771]: pgmap v41: 196 pgs: 1 unknown, 28 peering, 167 active+clean; 450 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:52:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 23 09:52:25 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:25 compute-2 ceph-mon[75771]: osdmap e48: 3 total, 3 up, 3 in
Jan 23 09:52:25 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Jan 23 09:52:25 compute-2 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:26 compute-2 sudo[82772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:52:26 compute-2 sudo[82772]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:26 compute-2 sudo[82772]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:26 compute-2 sudo[82797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:52:26 compute-2 sudo[82797]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:27 compute-2 ceph-mon[75771]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 09:52:27 compute-2 ceph-mon[75771]: pgmap v43: 196 pgs: 1 creating+peering, 28 peering, 167 active+clean; 450 KiB data, 481 MiB used, 60 GiB / 60 GiB avail; 5.0 KiB/s rd, 795 B/s wr, 6 op/s
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:27 compute-2 ceph-mon[75771]: osdmap e49: 3 total, 3 up, 3 in
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.prgzmm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.prgzmm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 09:52:27 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 23 09:52:27 compute-2 podman[82863]: 2026-01-23 09:52:27.848485607 +0000 UTC m=+0.042035579 container create b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 23 09:52:27 compute-2 systemd[1]: Started libpod-conmon-b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621.scope.
Jan 23 09:52:27 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:52:27 compute-2 podman[82863]: 2026-01-23 09:52:27.829426714 +0000 UTC m=+0.022976706 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:27 compute-2 podman[82863]: 2026-01-23 09:52:27.952891378 +0000 UTC m=+0.146441370 container init b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:52:27 compute-2 podman[82863]: 2026-01-23 09:52:27.960658473 +0000 UTC m=+0.154208445 container start b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 23 09:52:27 compute-2 podman[82863]: 2026-01-23 09:52:27.964687269 +0000 UTC m=+0.158237231 container attach b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:52:27 compute-2 vigilant_euclid[82879]: 167 167
Jan 23 09:52:27 compute-2 systemd[1]: libpod-b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621.scope: Deactivated successfully.
Jan 23 09:52:27 compute-2 podman[82863]: 2026-01-23 09:52:27.968332185 +0000 UTC m=+0.161882167 container died b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 09:52:28 compute-2 systemd[1]: var-lib-containers-storage-overlay-1a1cd03d6952b18e3d893771895d0a5cc146194b0b66dbf1b838e954a3c9347d-merged.mount: Deactivated successfully.
Jan 23 09:52:28 compute-2 podman[82863]: 2026-01-23 09:52:28.033909413 +0000 UTC m=+0.227459385 container remove b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:52:28 compute-2 systemd[1]: libpod-conmon-b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621.scope: Deactivated successfully.
Jan 23 09:52:28 compute-2 systemd[1]: Reloading.
Jan 23 09:52:28 compute-2 systemd-sysv-generator[82925]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:28 compute-2 systemd-rc-local-generator[82919]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:28 compute-2 ceph-mon[75771]: Deploying daemon mds.cephfs.compute-2.prgzmm on compute-2
Jan 23 09:52:28 compute-2 ceph-mon[75771]: pgmap v45: 197 pgs: 1 unknown, 1 creating+peering, 195 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 4.7 KiB/s rd, 744 B/s wr, 6 op/s
Jan 23 09:52:28 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 09:52:28 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 09:52:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 09:52:28 compute-2 ceph-mon[75771]: osdmap e50: 3 total, 3 up, 3 in
Jan 23 09:52:28 compute-2 systemd[1]: Reloading.
Jan 23 09:52:28 compute-2 systemd-rc-local-generator[82962]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:28 compute-2 systemd-sysv-generator[82965]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 23 09:52:28 compute-2 systemd[1]: Starting Ceph mds.cephfs.compute-2.prgzmm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:52:29 compute-2 podman[83020]: 2026-01-23 09:52:29.297214602 +0000 UTC m=+0.066349488 container create a773ba3ad4991e41d239798ec097b4bbf1907c18732274fc30c903f6eda5a6f2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:52:29 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b637cac10d75ade71c1ac4f1d68eb4ec9651c03a72730fe42c20c1d2a5060beb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:29 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b637cac10d75ade71c1ac4f1d68eb4ec9651c03a72730fe42c20c1d2a5060beb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:29 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b637cac10d75ade71c1ac4f1d68eb4ec9651c03a72730fe42c20c1d2a5060beb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:29 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b637cac10d75ade71c1ac4f1d68eb4ec9651c03a72730fe42c20c1d2a5060beb/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.prgzmm supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:29 compute-2 podman[83020]: 2026-01-23 09:52:29.27611423 +0000 UTC m=+0.045249136 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:29 compute-2 podman[83020]: 2026-01-23 09:52:29.518971192 +0000 UTC m=+0.288106108 container init a773ba3ad4991e41d239798ec097b4bbf1907c18732274fc30c903f6eda5a6f2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:52:29 compute-2 podman[83020]: 2026-01-23 09:52:29.524459602 +0000 UTC m=+0.293594488 container start a773ba3ad4991e41d239798ec097b4bbf1907c18732274fc30c903f6eda5a6f2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:52:29 compute-2 bash[83020]: a773ba3ad4991e41d239798ec097b4bbf1907c18732274fc30c903f6eda5a6f2
Jan 23 09:52:29 compute-2 systemd[1]: Started Ceph mds.cephfs.compute-2.prgzmm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:52:29 compute-2 ceph-mds[83039]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:52:29 compute-2 ceph-mds[83039]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Jan 23 09:52:29 compute-2 ceph-mds[83039]: main not setting numa affinity
Jan 23 09:52:29 compute-2 ceph-mds[83039]: pidfile_write: ignore empty --pid-file
Jan 23 09:52:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm[83035]: starting mds.cephfs.compute-2.prgzmm at 
Jan 23 09:52:29 compute-2 sudo[82797]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:29 compute-2 ceph-mon[75771]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 09:52:29 compute-2 ceph-mon[75771]: osdmap e51: 3 total, 3 up, 3 in
Jan 23 09:52:29 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:29 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 2 from mon.1
Jan 23 09:52:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 23 09:52:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Jan 23 09:52:30 compute-2 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:30 compute-2 ceph-mon[75771]: pgmap v48: 197 pgs: 197 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:52:30 compute-2 ceph-mon[75771]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 09:52:30 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:30 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:30 compute-2 ceph-mon[75771]: osdmap e52: 3 total, 3 up, 3 in
Jan 23 09:52:30 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:30 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:30 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:30 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e3 new map
Jan 23 09:52:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e3 print_map
                                           e3
                                           btime 2026-01-23T09:52:30:834166+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:51:34.000760+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-2.prgzmm{-1:24193} state up:standby seq 1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 3 from mon.1
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Monitors have assigned me to become a standby
Jan 23 09:52:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e4 new map
Jan 23 09:52:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e4 print_map
                                           e4
                                           btime 2026-01-23T09:52:31.070018+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:31.070004+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:creating seq 1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 4 from mon.1
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.4 handle_mds_map I am now mds.0.4
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x1
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x100
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x600
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x601
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x602
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x603
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x604
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x605
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x606
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x607
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x608
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x609
Jan 23 09:52:31 compute-2 ceph-mds[83039]: mds.0.4 creating_done
Jan 23 09:52:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 23 09:52:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Jan 23 09:52:31 compute-2 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ymknms", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ymknms", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: Deploying daemon mds.cephfs.compute-0.ymknms on compute-0
Jan 23 09:52:32 compute-2 ceph-mon[75771]: pgmap v50: 198 pgs: 1 unknown, 197 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:52:32 compute-2 ceph-mon[75771]: mds.? [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] up:boot
Jan 23 09:52:32 compute-2 ceph-mon[75771]: daemon mds.cephfs.compute-2.prgzmm assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 23 09:52:32 compute-2 ceph-mon[75771]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 23 09:52:32 compute-2 ceph-mon[75771]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 23 09:52:32 compute-2 ceph-mon[75771]: Cluster is now healthy
Jan 23 09:52:32 compute-2 ceph-mon[75771]: fsmap cephfs:0 1 up:standby
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.prgzmm"}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:creating}
Jan 23 09:52:32 compute-2 ceph-mon[75771]: daemon mds.cephfs.compute-2.prgzmm is now active in filesystem cephfs as rank 0
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 09:52:32 compute-2 ceph-mon[75771]: osdmap e53: 3 total, 3 up, 3 in
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e5 new map
Jan 23 09:52:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e5 print_map
                                           e5
                                           btime 2026-01-23T09:52:32.417167+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:32.417165+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Jan 23 09:52:32 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 5 from mon.1
Jan 23 09:52:32 compute-2 ceph-mds[83039]: mds.0.4 handle_mds_map I am now mds.0.4
Jan 23 09:52:32 compute-2 ceph-mds[83039]: mds.0.4 handle_mds_map state change up:creating --> up:active
Jan 23 09:52:32 compute-2 ceph-mds[83039]: mds.0.4 recovery_done -- successful recovery!
Jan 23 09:52:32 compute-2 ceph-mds[83039]: mds.0.4 active_start
Jan 23 09:52:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 23 09:52:33 compute-2 ceph-mon[75771]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 09:52:33 compute-2 ceph-mon[75771]: mds.? [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] up:active
Jan 23 09:52:33 compute-2 ceph-mon[75771]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active}
Jan 23 09:52:33 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 09:52:33 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 09:52:33 compute-2 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 09:52:33 compute-2 ceph-mon[75771]: osdmap e54: 3 total, 3 up, 3 in
Jan 23 09:52:33 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e6 new map
Jan 23 09:52:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e6 print_map
                                           e6
                                           btime 2026-01-23T09:52:33.487599+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:32.417165+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:34 compute-2 radosgw[82185]: v1 topic migration: starting v1 topic migration..
Jan 23 09:52:34 compute-2 radosgw[82185]: LDAP not started since no server URIs were provided in the configuration.
Jan 23 09:52:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-2-yzflfx[82181]: 2026-01-23T09:52:34.009+0000 7f834d527980 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 23 09:52:34 compute-2 radosgw[82185]: v1 topic migration: finished v1 topic migration
Jan 23 09:52:34 compute-2 radosgw[82185]: framework: beast
Jan 23 09:52:34 compute-2 radosgw[82185]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 23 09:52:34 compute-2 radosgw[82185]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 23 09:52:34 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-2 radosgw[82185]: starting handler: beast
Jan 23 09:52:34 compute-2 radosgw[82185]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:52:34 compute-2 radosgw[82185]: mgrc service_daemon_register rgw.24181 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.yzflfx,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=75d0a494-c738-4cca-b87e-be71cfd0ed45,zone_name=default,zonegroup_id=6635d7c3-d02c-4c4b-90b3-4ee042e293d6,zonegroup_name=default}
Jan 23 09:52:34 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 23 09:52:34 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 09:52:35 compute-2 ceph-mon[75771]: pgmap v53: 198 pgs: 1 unknown, 197 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:52:35 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:35 compute-2 ceph-mon[75771]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 09:52:35 compute-2 ceph-mon[75771]: Cluster is now healthy
Jan 23 09:52:35 compute-2 ceph-mon[75771]: mds.? [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] up:boot
Jan 23 09:52:35 compute-2 ceph-mon[75771]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 1 up:standby
Jan 23 09:52:35 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.ymknms"}]: dispatch
Jan 23 09:52:35 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 23 09:52:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:35 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 23 09:52:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 23 09:52:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:36 compute-2 ceph-mon[75771]: osdmap e55: 3 total, 3 up, 3 in
Jan 23 09:52:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:36 compute-2 ceph-mon[75771]: pgmap v55: 198 pgs: 198 active+clean; 456 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 11 KiB/s wr, 41 op/s
Jan 23 09:52:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bcvzvj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 09:52:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bcvzvj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 09:52:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:36 compute-2 ceph-mon[75771]: Deploying daemon mds.cephfs.compute-1.bcvzvj on compute-1
Jan 23 09:52:37 compute-2 ceph-mds[83039]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Jan 23 09:52:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm[83035]: 2026-01-23T09:52:37.079+0000 7f6dd35e5640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Jan 23 09:52:37 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 23 09:52:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:37 compute-2 ceph-mon[75771]: osdmap e56: 3 total, 3 up, 3 in
Jan 23 09:52:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:37 compute-2 ceph-mon[75771]: pgmap v57: 229 pgs: 31 unknown, 198 active+clean; 456 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 8.8 KiB/s wr, 34 op/s
Jan 23 09:52:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:37 compute-2 ceph-mon[75771]: osdmap e57: 3 total, 3 up, 3 in
Jan 23 09:52:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:37 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 23 09:52:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e7 new map
Jan 23 09:52:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e7 print_map
                                           e7
                                           btime 2026-01-23T09:52:38.529421+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:32.417165+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:39 compute-2 ceph-mon[75771]: 8.16 scrub starts
Jan 23 09:52:39 compute-2 ceph-mon[75771]: 8.16 scrub ok
Jan 23 09:52:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:39 compute-2 ceph-mon[75771]: osdmap e58: 3 total, 3 up, 3 in
Jan 23 09:52:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 09:52:39 compute-2 ceph-mon[75771]: mds.? [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] up:boot
Jan 23 09:52:39 compute-2 ceph-mon[75771]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 2 up:standby
Jan 23 09:52:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.bcvzvj"}]: dispatch
Jan 23 09:52:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.16( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.17( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.16( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.3( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.2( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.8( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.9( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.5( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.1c( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.f( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.d( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.b( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.a( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.11( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.3( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.b( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.15( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.6( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.7( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.5( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.18( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.1f( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.13( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.9( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.c( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-2 ceph-mon[75771]: Creating key for client.nfs.cephfs.0.0.compute-1.bawllm
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 09:52:40 compute-2 ceph-mon[75771]: pgmap v60: 260 pgs: 260 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 216 KiB/s rd, 0 B/s wr, 364 op/s
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:40 compute-2 ceph-mon[75771]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 09:52:40 compute-2 ceph-mon[75771]: 8.14 scrub starts
Jan 23 09:52:40 compute-2 ceph-mon[75771]: 8.14 scrub ok
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:52:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:52:40 compute-2 ceph-mon[75771]: osdmap e59: 3 total, 3 up, 3 in
Jan 23 09:52:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e8 new map
Jan 23 09:52:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e8 print_map
                                           e8
                                           btime 2026-01-23T09:52:40:798611+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:39.805778+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 4 join_fscid=1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:40 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 8 from mon.1
Jan 23 09:52:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.16( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.17( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.16( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.3( v 44'12 (0'0,44'12] local-lis/les=59/60 n=1 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.15( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.3( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.13( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.9( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.18( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.1f( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.c( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.5( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.6( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.2( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.7( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.11( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.b( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.b( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.1d( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.5( v 44'12 (0'0,44'12] local-lis/les=59/60 n=1 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.1c( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.9( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.f( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.d( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.8( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.a( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Jan 23 09:52:41 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Jan 23 09:52:41 compute-2 ceph-mon[75771]: 9.14 scrub starts
Jan 23 09:52:41 compute-2 ceph-mon[75771]: 9.14 scrub ok
Jan 23 09:52:41 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 09:52:41 compute-2 ceph-mon[75771]: mds.? [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] up:active
Jan 23 09:52:41 compute-2 ceph-mon[75771]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 2 up:standby
Jan 23 09:52:41 compute-2 ceph-mon[75771]: pgmap v62: 322 pgs: 62 unknown, 260 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 206 KiB/s rd, 0 B/s wr, 347 op/s
Jan 23 09:52:41 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:41 compute-2 ceph-mon[75771]: osdmap e60: 3 total, 3 up, 3 in
Jan 23 09:52:41 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:41 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:41 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e9 new map
Jan 23 09:52:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).mds e9 print_map
                                           e9
                                           btime 2026-01-23T09:52:42:200523+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:39.805778+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 4 join_fscid=1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 3 join_fscid=1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 23 09:52:42 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Jan 23 09:52:42 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Jan 23 09:52:43 compute-2 ceph-mon[75771]: Rados config object exists: conf-nfs.cephfs
Jan 23 09:52:43 compute-2 ceph-mon[75771]: Creating key for client.nfs.cephfs.0.0.compute-1.bawllm-rgw
Jan 23 09:52:43 compute-2 ceph-mon[75771]: 9.2 scrub starts
Jan 23 09:52:43 compute-2 ceph-mon[75771]: 9.2 scrub ok
Jan 23 09:52:43 compute-2 ceph-mon[75771]: Bind address in nfs.cephfs.0.0.compute-1.bawllm's ganesha conf is defaulting to empty
Jan 23 09:52:43 compute-2 ceph-mon[75771]: Deploying daemon nfs.cephfs.0.0.compute-1.bawllm on compute-1
Jan 23 09:52:43 compute-2 ceph-mon[75771]: 9.16 scrub starts
Jan 23 09:52:43 compute-2 ceph-mon[75771]: 9.16 scrub ok
Jan 23 09:52:43 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:43 compute-2 ceph-mon[75771]: osdmap e61: 3 total, 3 up, 3 in
Jan 23 09:52:43 compute-2 ceph-mon[75771]: mds.? [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] up:standby
Jan 23 09:52:43 compute-2 ceph-mon[75771]: mds.? [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] up:standby
Jan 23 09:52:43 compute-2 ceph-mon[75771]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 2 up:standby
Jan 23 09:52:43 compute-2 ceph-mon[75771]: 9.a scrub starts
Jan 23 09:52:43 compute-2 ceph-mon[75771]: 9.a scrub ok
Jan 23 09:52:43 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Jan 23 09:52:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 23 09:52:43 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Jan 23 09:52:44 compute-2 ceph-mon[75771]: 9.17 scrub starts
Jan 23 09:52:44 compute-2 ceph-mon[75771]: 9.17 scrub ok
Jan 23 09:52:44 compute-2 ceph-mon[75771]: pgmap v65: 353 pgs: 93 unknown, 260 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 211 KiB/s rd, 0 B/s wr, 356 op/s
Jan 23 09:52:44 compute-2 ceph-mon[75771]: 9.6 deep-scrub starts
Jan 23 09:52:44 compute-2 ceph-mon[75771]: 9.6 deep-scrub ok
Jan 23 09:52:44 compute-2 ceph-mon[75771]: osdmap e62: 3 total, 3 up, 3 in
Jan 23 09:52:44 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Jan 23 09:52:44 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Jan 23 09:52:45 compute-2 ceph-mon[75771]: 8.3 scrub starts
Jan 23 09:52:45 compute-2 ceph-mon[75771]: 8.3 scrub ok
Jan 23 09:52:45 compute-2 ceph-mon[75771]: 9.11 scrub starts
Jan 23 09:52:45 compute-2 ceph-mon[75771]: 9.11 scrub ok
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 09:52:45 compute-2 ceph-mon[75771]: 8.15 scrub starts
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 09:52:45 compute-2 ceph-mon[75771]: 8.15 scrub ok
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 23 09:52:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:45 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.13 deep-scrub starts
Jan 23 09:52:45 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.13 deep-scrub ok
Jan 23 09:52:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.11( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.13( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.13( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.1( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.7( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.9( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.15( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.3( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.2( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.5( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.3( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.4( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.1e( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.1a( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.18( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.17( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.11( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.1d( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.16( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.13( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.17( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.e( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.8( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.a( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.3( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.19( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Jan 23 09:52:46 compute-2 ceph-mon[75771]: Creating key for client.nfs.cephfs.1.0.compute-2.tykohi
Jan 23 09:52:46 compute-2 ceph-mon[75771]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Jan 23 09:52:46 compute-2 ceph-mon[75771]: pgmap v67: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 0 B/s wr, 57 op/s; 105 B/s, 0 objects/s recovering
Jan 23 09:52:46 compute-2 ceph-mon[75771]: 9.f scrub starts
Jan 23 09:52:46 compute-2 ceph-mon[75771]: 9.f scrub ok
Jan 23 09:52:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.5( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.5( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.13( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.13( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.11( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.11( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.3( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.3( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.15( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.15( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.a( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.3( v 60'51 lc 50'38 (0'0,60'51] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=60'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.13( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.e( v 60'51 lc 50'26 (0'0,60'51] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=60'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.16( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.8( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.17( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.19( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.1d( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.1a( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.1e( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.3( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.2( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.11( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.7( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.9( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.4( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.17( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.18( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.13( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Jan 23 09:52:47 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Jan 23 09:52:47 compute-2 ceph-mon[75771]: 9.13 deep-scrub starts
Jan 23 09:52:47 compute-2 ceph-mon[75771]: 9.13 deep-scrub ok
Jan 23 09:52:47 compute-2 ceph-mon[75771]: 8.10 deep-scrub starts
Jan 23 09:52:47 compute-2 ceph-mon[75771]: 8.10 deep-scrub ok
Jan 23 09:52:47 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:52:47 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 23 09:52:47 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:52:47 compute-2 ceph-mon[75771]: osdmap e63: 3 total, 3 up, 3 in
Jan 23 09:52:47 compute-2 ceph-mon[75771]: 9.9 scrub starts
Jan 23 09:52:47 compute-2 ceph-mon[75771]: 9.9 scrub ok
Jan 23 09:52:47 compute-2 ceph-mon[75771]: pgmap v69: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 0 B/s wr, 49 op/s; 260 B/s, 1 objects/s recovering
Jan 23 09:52:47 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 23 09:52:47 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 23 09:52:47 compute-2 ceph-mon[75771]: osdmap e64: 3 total, 3 up, 3 in
Jan 23 09:52:48 compute-2 sudo[83106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:52:48 compute-2 sudo[83106]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:48 compute-2 sudo[83106]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:48 compute-2 sudo[83131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:52:48 compute-2 sudo[83131]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 23 09:52:48 compute-2 ceph-mon[75771]: 11.15 scrub starts
Jan 23 09:52:48 compute-2 ceph-mon[75771]: 11.15 scrub ok
Jan 23 09:52:48 compute-2 ceph-mon[75771]: 9.d scrub starts
Jan 23 09:52:48 compute-2 ceph-mon[75771]: 9.d scrub ok
Jan 23 09:52:48 compute-2 ceph-mon[75771]: 9.18 scrub starts
Jan 23 09:52:48 compute-2 ceph-mon[75771]: 9.18 scrub ok
Jan 23 09:52:48 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 09:52:48 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 09:52:48 compute-2 ceph-mon[75771]: Rados config object exists: conf-nfs.cephfs
Jan 23 09:52:48 compute-2 ceph-mon[75771]: Creating key for client.nfs.cephfs.1.0.compute-2.tykohi-rgw
Jan 23 09:52:48 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:48 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:48 compute-2 ceph-mon[75771]: Bind address in nfs.cephfs.1.0.compute-2.tykohi's ganesha conf is defaulting to empty
Jan 23 09:52:48 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:48 compute-2 ceph-mon[75771]: Deploying daemon nfs.cephfs.1.0.compute-2.tykohi on compute-2
Jan 23 09:52:48 compute-2 ceph-mon[75771]: osdmap e65: 3 total, 3 up, 3 in
Jan 23 09:52:49 compute-2 podman[83196]: 2026-01-23 09:52:49.586198175 +0000 UTC m=+0.063662444 container create ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Jan 23 09:52:49 compute-2 systemd[1]: Started libpod-conmon-ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc.scope.
Jan 23 09:52:49 compute-2 podman[83196]: 2026-01-23 09:52:49.565456892 +0000 UTC m=+0.042921181 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:49 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:52:49 compute-2 podman[83196]: 2026-01-23 09:52:49.69498555 +0000 UTC m=+0.172449839 container init ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 23 09:52:49 compute-2 podman[83196]: 2026-01-23 09:52:49.701372151 +0000 UTC m=+0.178836420 container start ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 23 09:52:49 compute-2 upbeat_albattani[83212]: 167 167
Jan 23 09:52:49 compute-2 systemd[1]: libpod-ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc.scope: Deactivated successfully.
Jan 23 09:52:49 compute-2 conmon[83212]: conmon ff1841fbb3ce48b7c39f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc.scope/container/memory.events
Jan 23 09:52:49 compute-2 podman[83196]: 2026-01-23 09:52:49.710081198 +0000 UTC m=+0.187545467 container attach ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:52:49 compute-2 podman[83196]: 2026-01-23 09:52:49.710513619 +0000 UTC m=+0.187977888 container died ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:52:49 compute-2 systemd[1]: var-lib-containers-storage-overlay-e08e1c1620331f43b94956035d7db3bc164be881adfcde6834723e5e11a23a0a-merged.mount: Deactivated successfully.
Jan 23 09:52:49 compute-2 podman[83196]: 2026-01-23 09:52:49.756064311 +0000 UTC m=+0.233528580 container remove ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:52:49 compute-2 systemd[1]: libpod-conmon-ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc.scope: Deactivated successfully.
Jan 23 09:52:49 compute-2 systemd[1]: Reloading.
Jan 23 09:52:49 compute-2 systemd-rc-local-generator[83256]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:49 compute-2 systemd-sysv-generator[83259]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=0/0 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=60'756 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=0/0 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=60'756 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'766 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'766 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=58'754 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'768 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'768 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'769 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'769 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'765 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'765 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:50 compute-2 ceph-mon[75771]: 8.e scrub starts
Jan 23 09:52:50 compute-2 ceph-mon[75771]: 8.e scrub ok
Jan 23 09:52:50 compute-2 ceph-mon[75771]: 9.10 scrub starts
Jan 23 09:52:50 compute-2 ceph-mon[75771]: 9.10 scrub ok
Jan 23 09:52:50 compute-2 ceph-mon[75771]: pgmap v72: 353 pgs: 1 active+recovering+remapped, 1 active+remapped, 8 remapped+peering, 14 active+recovery_wait+remapped, 329 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 58 KiB/s rd, 1.3 KiB/s wr, 111 op/s; 80/223 objects misplaced (35.874%); 227 B/s, 1 objects/s recovering
Jan 23 09:52:50 compute-2 ceph-mon[75771]: 9.c scrub starts
Jan 23 09:52:50 compute-2 ceph-mon[75771]: 9.c scrub ok
Jan 23 09:52:50 compute-2 systemd[1]: Reloading.
Jan 23 09:52:50 compute-2 systemd-rc-local-generator[83296]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:50 compute-2 systemd-sysv-generator[83300]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:50 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:52:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e66 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:50 compute-2 podman[83351]: 2026-01-23 09:52:50.63684899 +0000 UTC m=+0.034766827 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:50 compute-2 podman[83351]: 2026-01-23 09:52:50.734288286 +0000 UTC m=+0.132206103 container create 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:52:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:50 compute-2 podman[83351]: 2026-01-23 09:52:50.811589363 +0000 UTC m=+0.209507200 container init 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:52:50 compute-2 podman[83351]: 2026-01-23 09:52:50.816870638 +0000 UTC m=+0.214788455 container start 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:52:50 compute-2 bash[83351]: 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d
Jan 23 09:52:50 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:52:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 09:52:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 09:52:50 compute-2 sudo[83131]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 09:52:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 09:52:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 09:52:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 09:52:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 09:52:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:52:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 09:52:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 09:52:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:52:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:52:51 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 23 09:52:51 compute-2 ceph-mon[75771]: osdmap e66: 3 total, 3 up, 3 in
Jan 23 09:52:51 compute-2 ceph-mon[75771]: 8.1 deep-scrub starts
Jan 23 09:52:51 compute-2 ceph-mon[75771]: 8.1 deep-scrub ok
Jan 23 09:52:51 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:51 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:51 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:51 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 09:52:51 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 09:52:51 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'761 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'761 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'768 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'768 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=60'756 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=66/67 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=65'770 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=58'754 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=66/67 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'768 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=66/67 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'769 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=66/67 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'765 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:51 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.17 scrub starts
Jan 23 09:52:51 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.17 scrub ok
Jan 23 09:52:52 compute-2 ceph-mon[75771]: Creating key for client.nfs.cephfs.2.0.compute-0.fenqiu
Jan 23 09:52:52 compute-2 ceph-mon[75771]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Jan 23 09:52:52 compute-2 ceph-mon[75771]: pgmap v74: 353 pgs: 11 peering, 8 remapped+peering, 5 active+recovery_wait+remapped, 329 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 1.6 KiB/s wr, 69 op/s; 29/223 objects misplaced (13.004%); 240 B/s, 13 objects/s recovering
Jan 23 09:52:52 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 09:52:52 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:52 compute-2 ceph-mon[75771]: osdmap e67: 3 total, 3 up, 3 in
Jan 23 09:52:52 compute-2 ceph-mon[75771]: 9.0 scrub starts
Jan 23 09:52:52 compute-2 ceph-mon[75771]: 9.0 scrub ok
Jan 23 09:52:52 compute-2 ceph-mon[75771]: 10.17 scrub starts
Jan 23 09:52:52 compute-2 ceph-mon[75771]: 10.17 scrub ok
Jan 23 09:52:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 23 09:52:52 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'761 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:52 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:52 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'768 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:52 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:52 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:52 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Jan 23 09:52:52 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Jan 23 09:52:53 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.5 scrub starts
Jan 23 09:52:53 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.5 scrub ok
Jan 23 09:52:53 compute-2 ceph-mon[75771]: 9.1 deep-scrub starts
Jan 23 09:52:53 compute-2 ceph-mon[75771]: 9.1 deep-scrub ok
Jan 23 09:52:53 compute-2 ceph-mon[75771]: osdmap e68: 3 total, 3 up, 3 in
Jan 23 09:52:53 compute-2 ceph-mon[75771]: 10.7 scrub starts
Jan 23 09:52:53 compute-2 ceph-mon[75771]: 10.7 scrub ok
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000002:nfs.cephfs.1: -2
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 09:52:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:52:54 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Jan 23 09:52:54 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Jan 23 09:52:54 compute-2 ceph-mon[75771]: pgmap v77: 353 pgs: 11 peering, 8 remapped+peering, 5 active+recovery_wait+remapped, 329 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 1.6 KiB/s wr, 68 op/s; 29/223 objects misplaced (13.004%); 239 B/s, 12 objects/s recovering
Jan 23 09:52:54 compute-2 ceph-mon[75771]: 8.0 scrub starts
Jan 23 09:52:54 compute-2 ceph-mon[75771]: 8.0 scrub ok
Jan 23 09:52:54 compute-2 ceph-mon[75771]: 10.5 scrub starts
Jan 23 09:52:54 compute-2 ceph-mon[75771]: 10.5 scrub ok
Jan 23 09:52:54 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 09:52:54 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 09:52:54 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:54 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:54 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:55 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Jan 23 09:52:55 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Jan 23 09:52:55 compute-2 ceph-mon[75771]: 8.7 scrub starts
Jan 23 09:52:55 compute-2 ceph-mon[75771]: 8.7 scrub ok
Jan 23 09:52:55 compute-2 ceph-mon[75771]: Rados config object exists: conf-nfs.cephfs
Jan 23 09:52:55 compute-2 ceph-mon[75771]: Creating key for client.nfs.cephfs.2.0.compute-0.fenqiu-rgw
Jan 23 09:52:55 compute-2 ceph-mon[75771]: Bind address in nfs.cephfs.2.0.compute-0.fenqiu's ganesha conf is defaulting to empty
Jan 23 09:52:55 compute-2 ceph-mon[75771]: Deploying daemon nfs.cephfs.2.0.compute-0.fenqiu on compute-0
Jan 23 09:52:55 compute-2 ceph-mon[75771]: 9.3 scrub starts
Jan 23 09:52:55 compute-2 ceph-mon[75771]: 9.3 scrub ok
Jan 23 09:52:55 compute-2 ceph-mon[75771]: 8.17 deep-scrub starts
Jan 23 09:52:55 compute-2 ceph-mon[75771]: 8.17 deep-scrub ok
Jan 23 09:52:55 compute-2 ceph-mon[75771]: pgmap v78: 353 pgs: 11 peering, 5 active+recovery_wait+remapped, 337 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1.3 KiB/s wr, 3 op/s; 29/222 objects misplaced (13.063%); 318 B/s, 15 objects/s recovering
Jan 23 09:52:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:52:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:52:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:52:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:52:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:52:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:52:56 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.5 deep-scrub starts
Jan 23 09:52:56 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.5 deep-scrub ok
Jan 23 09:52:56 compute-2 ceph-mon[75771]: 9.4 deep-scrub starts
Jan 23 09:52:56 compute-2 ceph-mon[75771]: 9.4 deep-scrub ok
Jan 23 09:52:56 compute-2 ceph-mon[75771]: 8.1f scrub starts
Jan 23 09:52:56 compute-2 ceph-mon[75771]: 8.1f scrub ok
Jan 23 09:52:56 compute-2 ceph-mon[75771]: 9.e scrub starts
Jan 23 09:52:56 compute-2 ceph-mon[75771]: 9.e scrub ok
Jan 23 09:52:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:57 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.2 deep-scrub starts
Jan 23 09:52:57 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.2 deep-scrub ok
Jan 23 09:52:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 23 09:52:57 compute-2 ceph-mon[75771]: 9.1a scrub starts
Jan 23 09:52:57 compute-2 ceph-mon[75771]: 9.1a scrub ok
Jan 23 09:52:57 compute-2 ceph-mon[75771]: 8.5 deep-scrub starts
Jan 23 09:52:57 compute-2 ceph-mon[75771]: 8.5 deep-scrub ok
Jan 23 09:52:57 compute-2 ceph-mon[75771]: Deploying daemon haproxy.nfs.cephfs.compute-1.mnxlgm on compute-1
Jan 23 09:52:57 compute-2 ceph-mon[75771]: 9.15 scrub starts
Jan 23 09:52:57 compute-2 ceph-mon[75771]: 9.15 scrub ok
Jan 23 09:52:57 compute-2 ceph-mon[75771]: pgmap v79: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 296 B/s, 15 objects/s recovering
Jan 23 09:52:57 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 23 09:52:58 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Jan 23 09:52:58 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Jan 23 09:52:58 compute-2 ceph-mon[75771]: 9.1b scrub starts
Jan 23 09:52:58 compute-2 ceph-mon[75771]: 9.1b scrub ok
Jan 23 09:52:58 compute-2 ceph-mon[75771]: 8.2 deep-scrub starts
Jan 23 09:52:58 compute-2 ceph-mon[75771]: 8.2 deep-scrub ok
Jan 23 09:52:58 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 23 09:52:58 compute-2 ceph-mon[75771]: osdmap e69: 3 total, 3 up, 3 in
Jan 23 09:52:58 compute-2 ceph-mon[75771]: 8.1b scrub starts
Jan 23 09:52:58 compute-2 ceph-mon[75771]: 8.1b scrub ok
Jan 23 09:52:59 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Jan 23 09:52:59 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Jan 23 09:52:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 23 09:52:59 compute-2 ceph-mon[75771]: 8.1a scrub starts
Jan 23 09:52:59 compute-2 ceph-mon[75771]: 8.1a scrub ok
Jan 23 09:52:59 compute-2 ceph-mon[75771]: 8.6 scrub starts
Jan 23 09:52:59 compute-2 ceph-mon[75771]: 8.6 scrub ok
Jan 23 09:52:59 compute-2 ceph-mon[75771]: 8.4 scrub starts
Jan 23 09:52:59 compute-2 ceph-mon[75771]: 8.4 scrub ok
Jan 23 09:52:59 compute-2 ceph-mon[75771]: pgmap v81: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 92 B/s, 6 objects/s recovering
Jan 23 09:52:59 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 23 09:53:00 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 70 pg[10.14( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70) [2] r=0 lpr=70 pi=[59,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:00 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 70 pg[10.4( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70) [2] r=0 lpr=70 pi=[59,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:00 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 70 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70) [2] r=0 lpr=70 pi=[59,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:00 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 70 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70) [2] r=0 lpr=70 pi=[59,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:00 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.b scrub starts
Jan 23 09:53:00 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.b scrub ok
Jan 23 09:53:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:01 compute-2 ceph-mon[75771]: 9.19 scrub starts
Jan 23 09:53:01 compute-2 ceph-mon[75771]: 9.19 scrub ok
Jan 23 09:53:01 compute-2 ceph-mon[75771]: 8.11 scrub starts
Jan 23 09:53:01 compute-2 ceph-mon[75771]: 8.11 scrub ok
Jan 23 09:53:01 compute-2 ceph-mon[75771]: 8.18 scrub starts
Jan 23 09:53:01 compute-2 ceph-mon[75771]: 8.18 scrub ok
Jan 23 09:53:01 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 23 09:53:01 compute-2 ceph-mon[75771]: osdmap e70: 3 total, 3 up, 3 in
Jan 23 09:53:01 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:01 compute-2 ceph-mon[75771]: 8.12 scrub starts
Jan 23 09:53:01 compute-2 ceph-mon[75771]: 8.12 scrub ok
Jan 23 09:53:01 compute-2 anacron[2783]: Job `cron.weekly' started
Jan 23 09:53:01 compute-2 anacron[2783]: Job `cron.weekly' terminated
Jan 23 09:53:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 23 09:53:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.4( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.4( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.14( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.14( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:01 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.c scrub starts
Jan 23 09:53:01 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.c scrub ok
Jan 23 09:53:02 compute-2 ceph-mon[75771]: 9.1e scrub starts
Jan 23 09:53:02 compute-2 ceph-mon[75771]: 9.1e scrub ok
Jan 23 09:53:02 compute-2 ceph-mon[75771]: 8.b scrub starts
Jan 23 09:53:02 compute-2 ceph-mon[75771]: 8.b scrub ok
Jan 23 09:53:02 compute-2 ceph-mon[75771]: pgmap v83: 353 pgs: 4 unknown, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 90 B/s, 6 objects/s recovering
Jan 23 09:53:02 compute-2 ceph-mon[75771]: osdmap e71: 3 total, 3 up, 3 in
Jan 23 09:53:02 compute-2 ceph-mon[75771]: 9.1f scrub starts
Jan 23 09:53:02 compute-2 ceph-mon[75771]: 9.1f scrub ok
Jan 23 09:53:02 compute-2 ceph-mon[75771]: 9.12 scrub starts
Jan 23 09:53:02 compute-2 ceph-mon[75771]: 9.12 scrub ok
Jan 23 09:53:02 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.5 deep-scrub starts
Jan 23 09:53:02 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.5 deep-scrub ok
Jan 23 09:53:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:02 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6854000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 23 09:53:03 compute-2 ceph-mon[75771]: 8.c scrub starts
Jan 23 09:53:03 compute-2 ceph-mon[75771]: 8.c scrub ok
Jan 23 09:53:03 compute-2 ceph-mon[75771]: 8.1e scrub starts
Jan 23 09:53:03 compute-2 ceph-mon[75771]: 8.1e scrub ok
Jan 23 09:53:03 compute-2 ceph-mon[75771]: 9.5 deep-scrub starts
Jan 23 09:53:03 compute-2 ceph-mon[75771]: 9.5 deep-scrub ok
Jan 23 09:53:03 compute-2 ceph-mon[75771]: 8.19 scrub starts
Jan 23 09:53:03 compute-2 ceph-mon[75771]: 8.19 scrub ok
Jan 23 09:53:03 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:03 compute-2 ceph-mon[75771]: osdmap e72: 3 total, 3 up, 3 in
Jan 23 09:53:03 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:03 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:03 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 23 09:53:03 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 23 09:53:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 23 09:53:03 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:03 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:03 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:03 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:03 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:03 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:03 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:03 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:04 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Jan 23 09:53:04 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Jan 23 09:53:04 compute-2 ceph-mon[75771]: Deploying daemon haproxy.nfs.cephfs.compute-0.yeogal on compute-0
Jan 23 09:53:04 compute-2 ceph-mon[75771]: pgmap v86: 353 pgs: 4 unknown, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:04 compute-2 ceph-mon[75771]: 9.1c scrub starts
Jan 23 09:53:04 compute-2 ceph-mon[75771]: 9.1c scrub ok
Jan 23 09:53:04 compute-2 ceph-mon[75771]: 9.1d scrub starts
Jan 23 09:53:04 compute-2 ceph-mon[75771]: 9.1d scrub ok
Jan 23 09:53:04 compute-2 ceph-mon[75771]: osdmap e73: 3 total, 3 up, 3 in
Jan 23 09:53:04 compute-2 ceph-mon[75771]: 8.8 scrub starts
Jan 23 09:53:04 compute-2 ceph-mon[75771]: 8.8 scrub ok
Jan 23 09:53:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 23 09:53:04 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 74 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:04 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 74 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:04 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 74 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=73/74 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=72'768 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:04 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 74 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:04 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:05 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.b deep-scrub starts
Jan 23 09:53:05 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.b deep-scrub ok
Jan 23 09:53:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:05 compute-2 ceph-mon[75771]: 8.1d scrub starts
Jan 23 09:53:05 compute-2 ceph-mon[75771]: 8.1d scrub ok
Jan 23 09:53:05 compute-2 ceph-mon[75771]: 8.1c scrub starts
Jan 23 09:53:05 compute-2 ceph-mon[75771]: 8.1c scrub ok
Jan 23 09:53:05 compute-2 ceph-mon[75771]: 12.15 deep-scrub starts
Jan 23 09:53:05 compute-2 ceph-mon[75771]: 12.15 deep-scrub ok
Jan 23 09:53:05 compute-2 ceph-mon[75771]: osdmap e74: 3 total, 3 up, 3 in
Jan 23 09:53:05 compute-2 ceph-mon[75771]: pgmap v89: 353 pgs: 4 unknown, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:05 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:06 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.f scrub starts
Jan 23 09:53:06 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.f scrub ok
Jan 23 09:53:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:06 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:06 compute-2 ceph-mon[75771]: 8.13 scrub starts
Jan 23 09:53:06 compute-2 ceph-mon[75771]: 8.13 scrub ok
Jan 23 09:53:06 compute-2 ceph-mon[75771]: 9.b deep-scrub starts
Jan 23 09:53:06 compute-2 ceph-mon[75771]: 9.b deep-scrub ok
Jan 23 09:53:06 compute-2 ceph-mon[75771]: 12.f scrub starts
Jan 23 09:53:06 compute-2 ceph-mon[75771]: 12.f scrub ok
Jan 23 09:53:06 compute-2 ceph-mon[75771]: 12.d scrub starts
Jan 23 09:53:06 compute-2 ceph-mon[75771]: 12.d scrub ok
Jan 23 09:53:07 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Jan 23 09:53:07 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Jan 23 09:53:08 compute-2 ceph-mon[75771]: 11.0 scrub starts
Jan 23 09:53:08 compute-2 ceph-mon[75771]: 11.0 scrub ok
Jan 23 09:53:08 compute-2 ceph-mon[75771]: 8.f scrub starts
Jan 23 09:53:08 compute-2 ceph-mon[75771]: 8.f scrub ok
Jan 23 09:53:08 compute-2 ceph-mon[75771]: pgmap v90: 353 pgs: 353 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 531 B/s wr, 53 op/s; 80 B/s, 4 objects/s recovering
Jan 23 09:53:08 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 23 09:53:08 compute-2 ceph-mon[75771]: 12.5 scrub starts
Jan 23 09:53:08 compute-2 ceph-mon[75771]: 12.5 scrub ok
Jan 23 09:53:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.869682312s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'764 mlcod 0'0 active pruub 82.272300720s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.869649887s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 82.272300720s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=66/67 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=75 pruub=14.722684860s) [1] r=-1 lpr=75 pi=[66,75)/1 crt=67'771 lcod 67'772 mlcod 67'772 active pruub 81.125633240s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=66/67 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=75 pruub=14.722566605s) [1] r=-1 lpr=75 pi=[66,75)/1 crt=67'771 lcod 67'772 mlcod 0'0 unknown NOTIFY pruub 81.125633240s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.868231773s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'768 mlcod 0'0 active pruub 82.272315979s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.868209839s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'768 mlcod 0'0 unknown NOTIFY pruub 82.272315979s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.867843628s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'759 mlcod 0'0 active pruub 82.272323608s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.867819786s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 82.272323608s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'764 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'764 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=66/67 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=76) [1]/[2] r=0 lpr=76 pi=[66,76)/1 crt=67'771 lcod 67'772 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'768 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'768 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=66/67 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=76) [1]/[2] r=0 lpr=76 pi=[66,76)/1 crt=67'771 lcod 67'772 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:08 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Jan 23 09:53:08 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Jan 23 09:53:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:08 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:08 compute-2 sudo[83425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:53:08 compute-2 sudo[83425]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:53:08 compute-2 sudo[83425]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:09 compute-2 sudo[83450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:53:09 compute-2 sudo[83450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:53:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:09 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6848001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 11.c scrub starts
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 11.c scrub ok
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 8.9 scrub starts
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 8.9 scrub ok
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 11.b scrub starts
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 11.b scrub ok
Jan 23 09:53:09 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 23 09:53:09 compute-2 ceph-mon[75771]: osdmap e75: 3 total, 3 up, 3 in
Jan 23 09:53:09 compute-2 ceph-mon[75771]: osdmap e76: 3 total, 3 up, 3 in
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 12.0 scrub starts
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 12.0 scrub ok
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 9.8 scrub starts
Jan 23 09:53:09 compute-2 ceph-mon[75771]: 9.8 scrub ok
Jan 23 09:53:09 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:09 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:09 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:09 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 23 09:53:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 23 09:53:09 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 77 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=76/77 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] async=[1] r=0 lpr=76 pi=[67,76)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:09 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 77 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=76/77 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] async=[1] r=0 lpr=76 pi=[67,76)/1 crt=62'764 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:09 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 77 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=76/77 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=76) [1]/[2] async=[1] r=0 lpr=76 pi=[66,76)/1 crt=68'773 lcod 67'772 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:09 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 77 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=76/77 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] async=[1] r=0 lpr=76 pi=[67,76)/1 crt=62'768 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:10 compute-2 ceph-mon[75771]: Deploying daemon haproxy.nfs.cephfs.compute-2.bbaqsj on compute-2
Jan 23 09:53:10 compute-2 ceph-mon[75771]: pgmap v93: 353 pgs: 353 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 556 B/s wr, 56 op/s; 84 B/s, 4 objects/s recovering
Jan 23 09:53:10 compute-2 ceph-mon[75771]: 11.9 scrub starts
Jan 23 09:53:10 compute-2 ceph-mon[75771]: 11.9 scrub ok
Jan 23 09:53:10 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 23 09:53:10 compute-2 ceph-mon[75771]: osdmap e77: 3 total, 3 up, 3 in
Jan 23 09:53:10 compute-2 ceph-mon[75771]: 12.1f scrub starts
Jan 23 09:53:10 compute-2 ceph-mon[75771]: 12.1f scrub ok
Jan 23 09:53:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 23 09:53:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:10 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:10 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=76/77 n=6 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.722089767s) [1] async=[1] r=-1 lpr=78 pi=[67,78)/1 crt=62'768 mlcod 62'768 active pruub 83.543327332s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:10 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=76/77 n=5 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721847534s) [1] async=[1] r=-1 lpr=78 pi=[67,78)/1 crt=62'759 mlcod 62'759 active pruub 83.543106079s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:10 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=76/77 n=5 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721715927s) [1] r=-1 lpr=78 pi=[67,78)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 83.543106079s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:10 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.5( v 77'776 (0'0,77'776] local-lis/les=76/77 n=8 ec=59/46 lis/c=76/66 les/c/f=77/67/0 sis=78 pruub=14.721512794s) [1] async=[1] r=-1 lpr=78 pi=[66,78)/1 crt=68'773 lcod 77'775 mlcod 77'775 active pruub 83.543243408s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:10 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.5( v 77'776 (0'0,77'776] local-lis/les=76/77 n=8 ec=59/46 lis/c=76/66 les/c/f=77/67/0 sis=78 pruub=14.721419334s) [1] r=-1 lpr=78 pi=[66,78)/1 crt=68'773 lcod 77'775 mlcod 0'0 unknown NOTIFY pruub 83.543243408s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:10 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=76/77 n=6 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721409798s) [1] r=-1 lpr=78 pi=[67,78)/1 crt=62'768 mlcod 0'0 unknown NOTIFY pruub 83.543327332s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:10 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=76/77 n=6 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721117973s) [1] async=[1] r=-1 lpr=78 pi=[67,78)/1 crt=62'764 mlcod 62'764 active pruub 83.543182373s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:10 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=76/77 n=6 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721064568s) [1] r=-1 lpr=78 pi=[67,78)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 83.543182373s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:11 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 23 09:53:11 compute-2 ceph-mon[75771]: 11.d scrub starts
Jan 23 09:53:11 compute-2 ceph-mon[75771]: 11.d scrub ok
Jan 23 09:53:11 compute-2 ceph-mon[75771]: 12.1b deep-scrub starts
Jan 23 09:53:11 compute-2 ceph-mon[75771]: 12.1b deep-scrub ok
Jan 23 09:53:11 compute-2 ceph-mon[75771]: osdmap e78: 3 total, 3 up, 3 in
Jan 23 09:53:11 compute-2 ceph-mon[75771]: pgmap v96: 353 pgs: 4 active+remapped, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 1 op/s; 148 B/s, 6 objects/s recovering
Jan 23 09:53:11 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 23 09:53:11 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=79 pruub=12.436098099s) [0] r=-1 lpr=79 pi=[67,79)/1 crt=62'761 mlcod 0'0 active pruub 82.265121460s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:11 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=79 pruub=12.436048508s) [0] r=-1 lpr=79 pi=[67,79)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 82.265121460s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:11 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.296006203s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'759 mlcod 0'0 active pruub 81.125587463s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:11 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.295859337s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 81.125587463s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:11 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.292468071s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'759 mlcod 0'0 active pruub 81.122230530s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:11 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.292396545s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 81.122230530s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:11 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.295536041s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'771 mlcod 0'0 active pruub 81.125885010s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:11 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.295515060s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 81.125885010s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:12 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1c scrub starts
Jan 23 09:53:12 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1c scrub ok
Jan 23 09:53:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:12 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:12 compute-2 ceph-mon[75771]: 11.2 scrub starts
Jan 23 09:53:12 compute-2 ceph-mon[75771]: 11.2 scrub ok
Jan 23 09:53:12 compute-2 ceph-mon[75771]: 12.16 scrub starts
Jan 23 09:53:12 compute-2 ceph-mon[75771]: 12.16 scrub ok
Jan 23 09:53:12 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 23 09:53:12 compute-2 ceph-mon[75771]: osdmap e79: 3 total, 3 up, 3 in
Jan 23 09:53:12 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 23 09:53:12 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:12 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] r=0 lpr=80 pi=[67,80)/1 crt=62'761 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] r=0 lpr=80 pi=[67,80)/1 crt=62'761 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:12 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:12 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:13 compute-2 podman[83515]: 2026-01-23 09:53:13.267203207 +0000 UTC m=+3.757022439 container create 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 09:53:13 compute-2 podman[83515]: 2026-01-23 09:53:13.250488626 +0000 UTC m=+3.740307878 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 09:53:13 compute-2 systemd[1]: Started libpod-conmon-3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb.scope.
Jan 23 09:53:13 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:53:13 compute-2 podman[83515]: 2026-01-23 09:53:13.343659 +0000 UTC m=+3.833478242 container init 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 09:53:13 compute-2 podman[83515]: 2026-01-23 09:53:13.352290342 +0000 UTC m=+3.842109584 container start 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 09:53:13 compute-2 podman[83515]: 2026-01-23 09:53:13.356417349 +0000 UTC m=+3.846236601 container attach 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 09:53:13 compute-2 boring_nightingale[83631]: 0 0
Jan 23 09:53:13 compute-2 systemd[1]: libpod-3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb.scope: Deactivated successfully.
Jan 23 09:53:13 compute-2 conmon[83631]: conmon 3467097f00d36d1528b6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb.scope/container/memory.events
Jan 23 09:53:13 compute-2 podman[83515]: 2026-01-23 09:53:13.359301316 +0000 UTC m=+3.849120548 container died 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 09:53:13 compute-2 systemd[1]: var-lib-containers-storage-overlay-3013f51abecae5064cc04fa32df82e14c54fabed3889a2cee404031f0085f126-merged.mount: Deactivated successfully.
Jan 23 09:53:13 compute-2 podman[83515]: 2026-01-23 09:53:13.41063739 +0000 UTC m=+3.900456622 container remove 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 09:53:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:13 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:13 compute-2 systemd[1]: libpod-conmon-3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb.scope: Deactivated successfully.
Jan 23 09:53:13 compute-2 systemd[1]: Reloading.
Jan 23 09:53:13 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1b scrub starts
Jan 23 09:53:13 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1b scrub ok
Jan 23 09:53:13 compute-2 systemd-sysv-generator[83680]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:13 compute-2 systemd-rc-local-generator[83677]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:13 compute-2 systemd[1]: Reloading.
Jan 23 09:53:13 compute-2 systemd-rc-local-generator[83712]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:13 compute-2 systemd-sysv-generator[83716]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:14 compute-2 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-2.bbaqsj for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:53:14 compute-2 ceph-mon[75771]: 11.6 scrub starts
Jan 23 09:53:14 compute-2 ceph-mon[75771]: 11.6 scrub ok
Jan 23 09:53:14 compute-2 ceph-mon[75771]: 10.1c scrub starts
Jan 23 09:53:14 compute-2 ceph-mon[75771]: 10.1c scrub ok
Jan 23 09:53:14 compute-2 ceph-mon[75771]: 12.14 scrub starts
Jan 23 09:53:14 compute-2 ceph-mon[75771]: 12.14 scrub ok
Jan 23 09:53:14 compute-2 ceph-mon[75771]: osdmap e80: 3 total, 3 up, 3 in
Jan 23 09:53:14 compute-2 ceph-mon[75771]: pgmap v99: 353 pgs: 4 active+remapped, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s; 148 B/s, 6 objects/s recovering
Jan 23 09:53:14 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 23 09:53:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 23 09:53:14 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 81 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=80/81 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] async=[0] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:14 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 81 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=80/81 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] async=[0] r=0 lpr=80 pi=[66,80)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:14 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 81 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=80/81 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] async=[0] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:14 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 81 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=80/81 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] async=[0] r=0 lpr=80 pi=[67,80)/1 crt=62'761 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:14 compute-2 podman[83775]: 2026-01-23 09:53:14.307550782 +0000 UTC m=+0.040058120 container create c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 09:53:14 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/687fed0fbddf5b7be24ce0985318e75d8f4a36ccd1305243ec9ed7a638372074/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 23 09:53:14 compute-2 podman[83775]: 2026-01-23 09:53:14.363312679 +0000 UTC m=+0.095820027 container init c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 09:53:14 compute-2 podman[83775]: 2026-01-23 09:53:14.368408849 +0000 UTC m=+0.100916187 container start c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 09:53:14 compute-2 bash[83775]: c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26
Jan 23 09:53:14 compute-2 podman[83775]: 2026-01-23 09:53:14.290696437 +0000 UTC m=+0.023203795 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 09:53:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [NOTICE] 022/095314 (2) : New worker #1 (4) forked
Jan 23 09:53:14 compute-2 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-2.bbaqsj for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:53:14 compute-2 sudo[83450]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:14 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.19 scrub starts
Jan 23 09:53:14 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.19 scrub ok
Jan 23 09:53:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:14 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:15 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 11.18 scrub starts
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 11.18 scrub ok
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 10.1b scrub starts
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 10.1b scrub ok
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 12.1 deep-scrub starts
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 12.1 deep-scrub ok
Jan 23 09:53:15 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 23 09:53:15 compute-2 ceph-mon[75771]: osdmap e81: 3 total, 3 up, 3 in
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 11.1f scrub starts
Jan 23 09:53:15 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 11.1f scrub ok
Jan 23 09:53:15 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 10.19 scrub starts
Jan 23 09:53:15 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 10.19 scrub ok
Jan 23 09:53:15 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 11.1a scrub starts
Jan 23 09:53:15 compute-2 ceph-mon[75771]: 11.1a scrub ok
Jan 23 09:53:15 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 23 09:53:15 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82 pruub=14.999643326s) [0] async=[0] r=-1 lpr=82 pi=[67,82)/1 crt=62'761 mlcod 62'761 active pruub 88.190757751s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82 pruub=14.999567986s) [0] r=-1 lpr=82 pi=[67,82)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 88.190757751s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:15 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.997087479s) [0] async=[0] r=-1 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 62'759 active pruub 88.190704346s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.996816635s) [0] r=-1 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 88.190704346s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:15 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=80/81 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.996311188s) [0] async=[0] r=-1 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 62'759 active pruub 88.190559387s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=80/81 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.996195793s) [0] r=-1 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 88.190559387s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:15 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=80/81 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.996054649s) [0] async=[0] r=-1 lpr=82 pi=[66,82)/1 crt=62'771 mlcod 62'771 active pruub 88.190666199s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=80/81 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.995622635s) [0] r=-1 lpr=82 pi=[66,82)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 88.190666199s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:15 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:15 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.d scrub starts
Jan 23 09:53:15 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.d scrub ok
Jan 23 09:53:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:16 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 23 09:53:16 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 09:53:16 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:53:16 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:53:16 compute-2 ceph-mon[75771]: Deploying daemon keepalived.nfs.cephfs.compute-1.vcrquf on compute-1
Jan 23 09:53:16 compute-2 ceph-mon[75771]: pgmap v101: 353 pgs: 4 peering, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 3 op/s; 50 B/s, 4 objects/s recovering
Jan 23 09:53:16 compute-2 ceph-mon[75771]: osdmap e82: 3 total, 3 up, 3 in
Jan 23 09:53:16 compute-2 ceph-mon[75771]: 11.10 scrub starts
Jan 23 09:53:16 compute-2 ceph-mon[75771]: 11.10 scrub ok
Jan 23 09:53:16 compute-2 ceph-mon[75771]: 8.d scrub starts
Jan 23 09:53:16 compute-2 ceph-mon[75771]: 8.d scrub ok
Jan 23 09:53:16 compute-2 ceph-mon[75771]: 11.1e scrub starts
Jan 23 09:53:16 compute-2 ceph-mon[75771]: 11.1e scrub ok
Jan 23 09:53:16 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.a scrub starts
Jan 23 09:53:16 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.a scrub ok
Jan 23 09:53:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:16 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:17 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:17 compute-2 ceph-mon[75771]: osdmap e83: 3 total, 3 up, 3 in
Jan 23 09:53:17 compute-2 ceph-mon[75771]: 8.a scrub starts
Jan 23 09:53:17 compute-2 ceph-mon[75771]: 8.a scrub ok
Jan 23 09:53:17 compute-2 ceph-mon[75771]: 11.11 scrub starts
Jan 23 09:53:17 compute-2 ceph-mon[75771]: 11.11 scrub ok
Jan 23 09:53:17 compute-2 ceph-mon[75771]: 11.1c scrub starts
Jan 23 09:53:17 compute-2 ceph-mon[75771]: 11.1c scrub ok
Jan 23 09:53:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 23 09:53:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:17 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:17 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Jan 23 09:53:17 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Jan 23 09:53:18 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.a scrub starts
Jan 23 09:53:18 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.a scrub ok
Jan 23 09:53:18 compute-2 ceph-mon[75771]: pgmap v104: 353 pgs: 4 peering, 349 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Jan 23 09:53:18 compute-2 ceph-mon[75771]: osdmap e84: 3 total, 3 up, 3 in
Jan 23 09:53:18 compute-2 ceph-mon[75771]: 9.7 scrub starts
Jan 23 09:53:18 compute-2 ceph-mon[75771]: 9.7 scrub ok
Jan 23 09:53:18 compute-2 ceph-mon[75771]: 12.10 scrub starts
Jan 23 09:53:18 compute-2 ceph-mon[75771]: 11.1b scrub starts
Jan 23 09:53:18 compute-2 ceph-mon[75771]: 12.10 scrub ok
Jan 23 09:53:18 compute-2 ceph-mon[75771]: 11.1b scrub ok
Jan 23 09:53:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:18 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 23 09:53:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:19 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480032d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:19 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:19 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Jan 23 09:53:19 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Jan 23 09:53:19 compute-2 ceph-mon[75771]: 11.a scrub starts
Jan 23 09:53:19 compute-2 ceph-mon[75771]: 11.a scrub ok
Jan 23 09:53:19 compute-2 ceph-mon[75771]: 11.1d deep-scrub starts
Jan 23 09:53:19 compute-2 ceph-mon[75771]: 12.6 scrub starts
Jan 23 09:53:19 compute-2 ceph-mon[75771]: 11.1d deep-scrub ok
Jan 23 09:53:19 compute-2 ceph-mon[75771]: 12.6 scrub ok
Jan 23 09:53:19 compute-2 ceph-mon[75771]: osdmap e85: 3 total, 3 up, 3 in
Jan 23 09:53:19 compute-2 ceph-mon[75771]: pgmap v107: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 109 B/s, 3 objects/s recovering
Jan 23 09:53:19 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 23 09:53:19 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:19 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:19 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:19 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:53:19 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 09:53:19 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:53:19 compute-2 ceph-mon[75771]: Deploying daemon keepalived.nfs.cephfs.compute-0.lrsdkc on compute-0
Jan 23 09:53:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 23 09:53:19 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 86 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=86 pruub=12.381774902s) [0] r=-1 lpr=86 pi=[67,86)/1 crt=62'771 mlcod 0'0 active pruub 90.272521973s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:19 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 86 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=86 pruub=12.381703377s) [0] r=-1 lpr=86 pi=[67,86)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 90.272521973s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:19 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 86 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=86 pruub=11.234715462s) [0] r=-1 lpr=86 pi=[66,86)/1 crt=58'754 mlcod 0'0 active pruub 89.125755310s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:19 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 86 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=86 pruub=11.234684944s) [0] r=-1 lpr=86 pi=[66,86)/1 crt=58'754 mlcod 0'0 unknown NOTIFY pruub 89.125755310s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:20 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Jan 23 09:53:20 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Jan 23 09:53:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:20 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:21 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:21 compute-2 ceph-mon[75771]: 11.4 scrub starts
Jan 23 09:53:21 compute-2 ceph-mon[75771]: 11.4 scrub ok
Jan 23 09:53:21 compute-2 ceph-mon[75771]: 11.16 scrub starts
Jan 23 09:53:21 compute-2 ceph-mon[75771]: 11.16 scrub ok
Jan 23 09:53:21 compute-2 ceph-mon[75771]: 12.c deep-scrub starts
Jan 23 09:53:21 compute-2 ceph-mon[75771]: 12.c deep-scrub ok
Jan 23 09:53:21 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 23 09:53:21 compute-2 ceph-mon[75771]: osdmap e86: 3 total, 3 up, 3 in
Jan 23 09:53:21 compute-2 ceph-mon[75771]: 11.7 scrub starts
Jan 23 09:53:21 compute-2 ceph-mon[75771]: 11.7 scrub ok
Jan 23 09:53:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 23 09:53:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 87 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] r=0 lpr=87 pi=[67,87)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 87 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] r=0 lpr=87 pi=[67,87)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 87 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] r=0 lpr=87 pi=[66,87)/1 crt=58'754 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:21 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 87 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] r=0 lpr=87 pi=[66,87)/1 crt=58'754 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:21 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480032d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:21 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Jan 23 09:53:21 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 12.12 scrub starts
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 11.8 scrub starts
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 12.12 scrub ok
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 11.8 scrub ok
Jan 23 09:53:22 compute-2 ceph-mon[75771]: pgmap v109: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 6 objects/s recovering
Jan 23 09:53:22 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 23 09:53:22 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 23 09:53:22 compute-2 ceph-mon[75771]: osdmap e87: 3 total, 3 up, 3 in
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 11.f deep-scrub starts
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 11.f deep-scrub ok
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 12.b scrub starts
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 12.b scrub ok
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 11.13 scrub starts
Jan 23 09:53:22 compute-2 ceph-mon[75771]: 11.13 scrub ok
Jan 23 09:53:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 23 09:53:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 88 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=87/88 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] async=[0] r=0 lpr=87 pi=[67,87)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:22 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 88 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=87/88 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] async=[0] r=0 lpr=87 pi=[66,87)/1 crt=58'754 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:22 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Jan 23 09:53:22 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Jan 23 09:53:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:22 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480032d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:23 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=87/88 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89 pruub=14.983925819s) [0] async=[0] r=-1 lpr=89 pi=[66,89)/1 crt=58'754 mlcod 58'754 active pruub 96.317939758s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=87/88 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89 pruub=14.983826637s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=58'754 mlcod 0'0 unknown NOTIFY pruub 96.317939758s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=87/88 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89 pruub=14.983630180s) [0] async=[0] r=-1 lpr=89 pi=[67,89)/1 crt=62'771 mlcod 62'771 active pruub 96.317901611s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89 pruub=15.791469574s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=60'756 mlcod 0'0 active pruub 97.125907898s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89 pruub=15.791426659s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=60'756 mlcod 0'0 unknown NOTIFY pruub 97.125907898s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=87/88 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89 pruub=14.983448982s) [0] r=-1 lpr=89 pi=[67,89)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 96.317901611s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89 pruub=15.790423393s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=62'759 mlcod 0'0 active pruub 97.126426697s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89 pruub=15.790397644s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 97.126426697s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:23 compute-2 ceph-mon[75771]: osdmap e88: 3 total, 3 up, 3 in
Jan 23 09:53:23 compute-2 ceph-mon[75771]: 11.17 scrub starts
Jan 23 09:53:23 compute-2 ceph-mon[75771]: 11.17 scrub ok
Jan 23 09:53:23 compute-2 ceph-mon[75771]: 12.e scrub starts
Jan 23 09:53:23 compute-2 ceph-mon[75771]: 12.e scrub ok
Jan 23 09:53:23 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 23 09:53:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:23 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:23 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Jan 23 09:53:23 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Jan 23 09:53:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 90 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=0 lpr=90 pi=[66,90)/1 crt=60'756 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 90 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=0 lpr=90 pi=[66,90)/1 crt=60'756 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 90 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=0 lpr=90 pi=[66,90)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 90 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=0 lpr=90 pi=[66,90)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:24 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1e scrub starts
Jan 23 09:53:24 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1e scrub ok
Jan 23 09:53:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:24 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480032d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:25 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:25 compute-2 ceph-mon[75771]: pgmap v112: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 7 objects/s recovering
Jan 23 09:53:25 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 23 09:53:25 compute-2 ceph-mon[75771]: osdmap e89: 3 total, 3 up, 3 in
Jan 23 09:53:25 compute-2 ceph-mon[75771]: 10.16 scrub starts
Jan 23 09:53:25 compute-2 ceph-mon[75771]: 10.16 scrub ok
Jan 23 09:53:25 compute-2 ceph-mon[75771]: 12.1d scrub starts
Jan 23 09:53:25 compute-2 ceph-mon[75771]: 12.1d scrub ok
Jan 23 09:53:25 compute-2 ceph-mon[75771]: osdmap e90: 3 total, 3 up, 3 in
Jan 23 09:53:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 23 09:53:25 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 91 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=90/91 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] async=[0] r=0 lpr=90 pi=[66,90)/1 crt=60'756 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:25 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 91 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=90/91 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] async=[0] r=0 lpr=90 pi=[66,90)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:25 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:25 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.2 scrub starts
Jan 23 09:53:25 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.2 scrub ok
Jan 23 09:53:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:25 compute-2 sudo[83804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:53:25 compute-2 sudo[83804]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:53:25 compute-2 sudo[83804]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:25 compute-2 sudo[83829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:53:25 compute-2 sudo[83829]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:53:26 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.3 scrub starts
Jan 23 09:53:26 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.3 scrub ok
Jan 23 09:53:26 compute-2 ceph-mon[75771]: 10.0 scrub starts
Jan 23 09:53:26 compute-2 ceph-mon[75771]: 10.0 scrub ok
Jan 23 09:53:26 compute-2 ceph-mon[75771]: 12.1e scrub starts
Jan 23 09:53:26 compute-2 ceph-mon[75771]: 12.1e scrub ok
Jan 23 09:53:26 compute-2 ceph-mon[75771]: pgmap v115: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 82 B/s, 4 objects/s recovering
Jan 23 09:53:26 compute-2 ceph-mon[75771]: 10.f scrub starts
Jan 23 09:53:26 compute-2 ceph-mon[75771]: 10.f scrub ok
Jan 23 09:53:26 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:26 compute-2 ceph-mon[75771]: osdmap e91: 3 total, 3 up, 3 in
Jan 23 09:53:26 compute-2 ceph-mon[75771]: 12.2 scrub starts
Jan 23 09:53:26 compute-2 ceph-mon[75771]: 12.2 scrub ok
Jan 23 09:53:26 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:26 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:26 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:26 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 23 09:53:27 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 92 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=90/91 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92 pruub=14.362304688s) [0] async=[0] r=-1 lpr=92 pi=[66,92)/1 crt=60'756 mlcod 60'756 active pruub 99.363479614s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:27 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 92 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=90/91 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92 pruub=14.362131119s) [0] r=-1 lpr=92 pi=[66,92)/1 crt=60'756 mlcod 0'0 unknown NOTIFY pruub 99.363479614s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:27 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 92 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=90/91 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92 pruub=14.364026070s) [0] async=[0] r=-1 lpr=92 pi=[66,92)/1 crt=62'759 mlcod 62'759 active pruub 99.365730286s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:27 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 92 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=90/91 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92 pruub=14.363988876s) [0] r=-1 lpr=92 pi=[66,92)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 99.365730286s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:27 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:27 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:27 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1a scrub starts
Jan 23 09:53:27 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1a scrub ok
Jan 23 09:53:28 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.9 scrub starts
Jan 23 09:53:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:28 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:29 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:29 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.9 scrub ok
Jan 23 09:53:29 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.4 scrub starts
Jan 23 09:53:29 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.4 scrub ok
Jan 23 09:53:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:29 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:29 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:53:29 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:53:29 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 09:53:29 compute-2 ceph-mon[75771]: Deploying daemon keepalived.nfs.cephfs.compute-2.pawaai on compute-2
Jan 23 09:53:29 compute-2 ceph-mon[75771]: 10.e scrub starts
Jan 23 09:53:29 compute-2 ceph-mon[75771]: 10.e scrub ok
Jan 23 09:53:29 compute-2 ceph-mon[75771]: 12.3 scrub starts
Jan 23 09:53:29 compute-2 ceph-mon[75771]: 12.3 scrub ok
Jan 23 09:53:29 compute-2 ceph-mon[75771]: osdmap e92: 3 total, 3 up, 3 in
Jan 23 09:53:29 compute-2 ceph-mon[75771]: pgmap v118: 353 pgs: 1 active+recovering+remapped, 1 active+remapped, 2 peering, 349 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 2/221 objects misplaced (0.905%); 137 B/s, 5 objects/s recovering
Jan 23 09:53:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 23 09:53:30 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.7 scrub starts
Jan 23 09:53:30 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.7 scrub ok
Jan 23 09:53:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:30 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:30 compute-2 podman[83893]: 2026-01-23 09:53:30.897199248 +0000 UTC m=+4.719883928 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 09:53:30 compute-2 podman[83893]: 2026-01-23 09:53:30.913604303 +0000 UTC m=+4.736288963 container create 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.28.2, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, description=keepalived for Ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, version=2.2.4, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 10.6 deep-scrub starts
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 10.6 deep-scrub ok
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 12.1a scrub starts
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 12.1a scrub ok
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 11.1 scrub starts
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 11.1 scrub ok
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 12.9 scrub starts
Jan 23 09:53:30 compute-2 ceph-mon[75771]: pgmap v119: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 19 B/s, 0 objects/s recovering
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 11.12 deep-scrub starts
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 11.12 deep-scrub ok
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 12.9 scrub ok
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 12.4 scrub starts
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 12.4 scrub ok
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 12.a scrub starts
Jan 23 09:53:30 compute-2 ceph-mon[75771]: 12.a scrub ok
Jan 23 09:53:30 compute-2 ceph-mon[75771]: osdmap e93: 3 total, 3 up, 3 in
Jan 23 09:53:30 compute-2 systemd[1]: Started libpod-conmon-26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943.scope.
Jan 23 09:53:30 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:53:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:31 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:31 compute-2 podman[83893]: 2026-01-23 09:53:31.232819155 +0000 UTC m=+5.055503815 container init 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, distribution-scope=public, name=keepalived, io.openshift.expose-services=, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, description=keepalived for Ceph, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, architecture=x86_64, release=1793, vendor=Red Hat, Inc.)
Jan 23 09:53:31 compute-2 podman[83893]: 2026-01-23 09:53:31.241743674 +0000 UTC m=+5.064428334 container start 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, io.openshift.expose-services=, io.buildah.version=1.28.2)
Jan 23 09:53:31 compute-2 podman[83893]: 2026-01-23 09:53:31.246241989 +0000 UTC m=+5.068926669 container attach 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., name=keepalived, vcs-type=git, distribution-scope=public, architecture=x86_64, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:53:31 compute-2 elastic_blackburn[83991]: 0 0
Jan 23 09:53:31 compute-2 systemd[1]: libpod-26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943.scope: Deactivated successfully.
Jan 23 09:53:31 compute-2 podman[83893]: 2026-01-23 09:53:31.24928636 +0000 UTC m=+5.071971020 container died 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, description=keepalived for Ceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, vcs-type=git, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.buildah.version=1.28.2, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20)
Jan 23 09:53:31 compute-2 systemd[1]: var-lib-containers-storage-overlay-99bd1a1218ddcc9282978193efca963c2b61900d03748cb8bf171d2e03c61cfb-merged.mount: Deactivated successfully.
Jan 23 09:53:31 compute-2 podman[83893]: 2026-01-23 09:53:31.29662684 +0000 UTC m=+5.119311670 container remove 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, name=keepalived, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, com.redhat.component=keepalived-container, release=1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 09:53:31 compute-2 systemd[1]: libpod-conmon-26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943.scope: Deactivated successfully.
Jan 23 09:53:31 compute-2 systemd[1]: Reloading.
Jan 23 09:53:31 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.19 deep-scrub starts
Jan 23 09:53:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:31 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:31 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.19 deep-scrub ok
Jan 23 09:53:31 compute-2 systemd-rc-local-generator[84043]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:31 compute-2 systemd-sysv-generator[84046]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:31 compute-2 systemd[1]: Reloading.
Jan 23 09:53:31 compute-2 systemd-rc-local-generator[84080]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:31 compute-2 systemd-sysv-generator[84084]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:31 compute-2 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-2.pawaai for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:53:31 compute-2 ceph-mon[75771]: 11.5 scrub starts
Jan 23 09:53:31 compute-2 ceph-mon[75771]: 11.5 scrub ok
Jan 23 09:53:31 compute-2 ceph-mon[75771]: 12.7 scrub starts
Jan 23 09:53:31 compute-2 ceph-mon[75771]: 12.7 scrub ok
Jan 23 09:53:31 compute-2 ceph-mon[75771]: 12.8 deep-scrub starts
Jan 23 09:53:31 compute-2 ceph-mon[75771]: 12.8 deep-scrub ok
Jan 23 09:53:31 compute-2 ceph-mon[75771]: pgmap v121: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:31 compute-2 ceph-mon[75771]: 11.14 deep-scrub starts
Jan 23 09:53:31 compute-2 ceph-mon[75771]: 11.14 deep-scrub ok
Jan 23 09:53:32 compute-2 podman[84139]: 2026-01-23 09:53:32.140566301 +0000 UTC m=+0.045486847 container create 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, release=1793, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.28.2, version=2.2.4, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc.)
Jan 23 09:53:32 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f2c7cd8a6b52dfd2865263f41bdb1091228e98f9902071822660490c39b1ea/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:53:32 compute-2 podman[84139]: 2026-01-23 09:53:32.20748226 +0000 UTC m=+0.112402806 container init 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.buildah.version=1.28.2, version=2.2.4, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, architecture=x86_64, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., distribution-scope=public, name=keepalived, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 23 09:53:32 compute-2 podman[84139]: 2026-01-23 09:53:32.212909747 +0000 UTC m=+0.117830293 container start 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, vcs-type=git, io.openshift.expose-services=, version=2.2.4, build-date=2023-02-22T09:23:20, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, architecture=x86_64, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., release=1793, name=keepalived)
Jan 23 09:53:32 compute-2 podman[84139]: 2026-01-23 09:53:32.121522255 +0000 UTC m=+0.026442821 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 09:53:32 compute-2 bash[84139]: 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa
Jan 23 09:53:32 compute-2 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-2.pawaai for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Starting VRRP child process, pid=4
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Startup complete
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: (VI_0) Entering BACKUP STATE (init)
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: VRRP_Script(check_backend) succeeded
Jan 23 09:53:32 compute-2 sudo[83829]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:32 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Jan 23 09:53:32 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Jan 23 09:53:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:32 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:32 compute-2 ceph-mon[75771]: 11.19 deep-scrub starts
Jan 23 09:53:32 compute-2 ceph-mon[75771]: 11.19 deep-scrub ok
Jan 23 09:53:32 compute-2 ceph-mon[75771]: 12.1c scrub starts
Jan 23 09:53:32 compute-2 ceph-mon[75771]: 12.1c scrub ok
Jan 23 09:53:32 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:32 compute-2 ceph-mon[75771]: 10.1e scrub starts
Jan 23 09:53:32 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:32 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:32 compute-2 ceph-mon[75771]: 10.1e scrub ok
Jan 23 09:53:32 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:32 compute-2 ceph-mon[75771]: Deploying daemon alertmanager.compute-0 on compute-0
Jan 23 09:53:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:33 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:33 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Jan 23 09:53:33 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Jan 23 09:53:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:33 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:34 compute-2 ceph-mon[75771]: 12.17 scrub starts
Jan 23 09:53:34 compute-2 ceph-mon[75771]: 12.17 scrub ok
Jan 23 09:53:34 compute-2 ceph-mon[75771]: 12.19 scrub starts
Jan 23 09:53:34 compute-2 ceph-mon[75771]: 12.19 scrub ok
Jan 23 09:53:34 compute-2 ceph-mon[75771]: pgmap v122: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:34 compute-2 ceph-mon[75771]: 10.1f scrub starts
Jan 23 09:53:34 compute-2 ceph-mon[75771]: 10.1f scrub ok
Jan 23 09:53:34 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.13 scrub starts
Jan 23 09:53:34 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.13 scrub ok
Jan 23 09:53:34 compute-2 sshd-session[84164]: Accepted publickey for zuul from 192.168.122.30 port 45662 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:53:34 compute-2 systemd-logind[786]: New session 36 of user zuul.
Jan 23 09:53:34 compute-2 systemd[1]: Started Session 36 of User zuul.
Jan 23 09:53:34 compute-2 sshd-session[84164]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:53:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:34 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:35 compute-2 ceph-mon[75771]: 12.11 scrub starts
Jan 23 09:53:35 compute-2 ceph-mon[75771]: 12.11 scrub ok
Jan 23 09:53:35 compute-2 ceph-mon[75771]: 10.15 scrub starts
Jan 23 09:53:35 compute-2 ceph-mon[75771]: 10.15 scrub ok
Jan 23 09:53:35 compute-2 ceph-mon[75771]: 10.10 scrub starts
Jan 23 09:53:35 compute-2 ceph-mon[75771]: 10.10 scrub ok
Jan 23 09:53:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:35 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:35 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Jan 23 09:53:35 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Jan 23 09:53:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:35 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:35 compute-2 python3.9[84317]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:53:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:35 2026: (VI_0) Entering MASTER STATE
Jan 23 09:53:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:35 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 23 09:53:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:35 2026: (VI_0) Entering BACKUP STATE
Jan 23 09:53:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 23 09:53:36 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 94 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94 pruub=8.604924202s) [0] r=-1 lpr=94 pi=[73,94)/1 crt=62'764 mlcod 0'0 active pruub 102.616806030s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:36 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 94 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94 pruub=8.604826927s) [0] r=-1 lpr=94 pi=[73,94)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 102.616806030s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:36 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 94 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94 pruub=8.607663155s) [0] r=-1 lpr=94 pi=[73,94)/1 crt=62'764 mlcod 0'0 active pruub 102.620361328s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:36 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 94 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94 pruub=8.607626915s) [0] r=-1 lpr=94 pi=[73,94)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 102.620361328s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:36 compute-2 ceph-mon[75771]: 12.13 scrub starts
Jan 23 09:53:36 compute-2 ceph-mon[75771]: 12.13 scrub ok
Jan 23 09:53:36 compute-2 ceph-mon[75771]: 10.12 scrub starts
Jan 23 09:53:36 compute-2 ceph-mon[75771]: 10.12 scrub ok
Jan 23 09:53:36 compute-2 ceph-mon[75771]: pgmap v123: 353 pgs: 353 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 23 09:53:36 compute-2 ceph-mon[75771]: 10.9 scrub starts
Jan 23 09:53:36 compute-2 ceph-mon[75771]: 10.9 scrub ok
Jan 23 09:53:36 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:36 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Jan 23 09:53:36 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Jan 23 09:53:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:36 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:37 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:37 compute-2 sudo[84533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmoxldodzbzfwyjjpduexalgqisstmvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162016.9625785-54-5077815476572/AnsiballZ_command.py'
Jan 23 09:53:37 compute-2 sudo[84533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:53:37 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.e scrub starts
Jan 23 09:53:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:37 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:37 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.e scrub ok
Jan 23 09:53:38 compute-2 python3.9[84535]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:53:38 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.4 deep-scrub starts
Jan 23 09:53:38 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.4 deep-scrub ok
Jan 23 09:53:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 23 09:53:38 compute-2 ceph-mon[75771]: 12.18 scrub starts
Jan 23 09:53:38 compute-2 ceph-mon[75771]: 10.d scrub starts
Jan 23 09:53:38 compute-2 ceph-mon[75771]: 12.18 scrub ok
Jan 23 09:53:38 compute-2 ceph-mon[75771]: 10.d scrub ok
Jan 23 09:53:38 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 23 09:53:38 compute-2 ceph-mon[75771]: osdmap e94: 3 total, 3 up, 3 in
Jan 23 09:53:38 compute-2 ceph-mon[75771]: 10.a scrub starts
Jan 23 09:53:38 compute-2 ceph-mon[75771]: 10.a scrub ok
Jan 23 09:53:38 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:38 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 95 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:38 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 95 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:38 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 95 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:38 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 95 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:38 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:39 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:39 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:39 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 11.3 scrub starts
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 11.3 scrub ok
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.8 scrub starts
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.8 scrub ok
Jan 23 09:53:39 compute-2 ceph-mon[75771]: pgmap v125: 353 pgs: 353 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.b deep-scrub starts
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.b deep-scrub ok
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 11.e scrub starts
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 11.e scrub ok
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.2 scrub starts
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.2 scrub ok
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.1a scrub starts
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.1a scrub ok
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.18 scrub starts
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.4 deep-scrub starts
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.18 scrub ok
Jan 23 09:53:39 compute-2 ceph-mon[75771]: 10.4 deep-scrub ok
Jan 23 09:53:39 compute-2 ceph-mon[75771]: osdmap e95: 3 total, 3 up, 3 in
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-2 ceph-mon[75771]: Regenerating cephadm self-signed grafana TLS certificates
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-2 ceph-mon[75771]: Deploying daemon grafana.compute-0 on compute-0
Jan 23 09:53:39 compute-2 ceph-mon[75771]: pgmap v127: 353 pgs: 353 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:39 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 09:53:39 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Jan 23 09:53:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 23 09:53:39 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 96 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] async=[0] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:39 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 96 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] async=[0] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:40 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Jan 23 09:53:40 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Jan 23 09:53:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:40 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 23 09:53:40 compute-2 ceph-mon[75771]: 10.1 scrub starts
Jan 23 09:53:40 compute-2 ceph-mon[75771]: 10.1d scrub starts
Jan 23 09:53:40 compute-2 ceph-mon[75771]: 10.1d scrub ok
Jan 23 09:53:40 compute-2 ceph-mon[75771]: 10.1 scrub ok
Jan 23 09:53:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 09:53:40 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 09:53:40 compute-2 ceph-mon[75771]: osdmap e96: 3 total, 3 up, 3 in
Jan 23 09:53:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 97 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97 pruub=14.590447426s) [0] async=[0] r=-1 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 62'764 active pruub 113.512527466s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 97 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97 pruub=14.590225220s) [0] r=-1 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 113.512527466s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 97 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97 pruub=14.589948654s) [0] async=[0] r=-1 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 62'764 active pruub 113.512481689s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 97 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97 pruub=14.589887619s) [0] r=-1 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 113.512481689s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:41 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:41 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:41 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 23 09:53:41 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 23 09:53:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 23 09:53:42 compute-2 ceph-mon[75771]: 10.13 scrub starts
Jan 23 09:53:42 compute-2 ceph-mon[75771]: 10.13 scrub ok
Jan 23 09:53:42 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:42 compute-2 ceph-mon[75771]: osdmap e97: 3 total, 3 up, 3 in
Jan 23 09:53:42 compute-2 ceph-mon[75771]: pgmap v130: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:42 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.313344) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022313535, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7502, "num_deletes": 257, "total_data_size": 18548443, "memory_usage": 19292720, "flush_reason": "Manual Compaction"}
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022450002, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11454841, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 251, "largest_seqno": 7507, "table_properties": {"data_size": 11424610, "index_size": 19300, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9861, "raw_key_size": 96717, "raw_average_key_size": 24, "raw_value_size": 11348944, "raw_average_value_size": 2879, "num_data_blocks": 851, "num_entries": 3941, "num_filter_entries": 3941, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 1769161840, "file_creation_time": 1769162022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 136751 microseconds, and 62105 cpu microseconds.
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.450157) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11454841 bytes OK
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.450189) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.454502) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.454555) EVENT_LOG_v1 {"time_micros": 1769162022454545, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.454587) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18506772, prev total WAL file size 18506772, number of live WAL files 2.
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.458996) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(1648B)]
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022459184, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11456489, "oldest_snapshot_seqno": -1}
Jan 23 09:53:42 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Jan 23 09:53:42 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3687 keys, 11451043 bytes, temperature: kUnknown
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022769767, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11451043, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11421450, "index_size": 19243, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 92435, "raw_average_key_size": 25, "raw_value_size": 11349002, "raw_average_value_size": 3078, "num_data_blocks": 850, "num_entries": 3687, "num_filter_entries": 3687, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.770193) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11451043 bytes
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.772602) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 36.9 rd, 36.9 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.9, 0.0 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3946, records dropped: 259 output_compression: NoCompression
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.772623) EVENT_LOG_v1 {"time_micros": 1769162022772613, "job": 4, "event": "compaction_finished", "compaction_time_micros": 310747, "compaction_time_cpu_micros": 226693, "output_level": 6, "num_output_files": 1, "total_output_size": 11451043, "num_input_records": 3946, "num_output_records": 3687, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022774460, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022774519, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 23 09:53:42 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.458750) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:53:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:42 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:43 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:43 compute-2 ceph-mon[75771]: 10.11 scrub starts
Jan 23 09:53:43 compute-2 ceph-mon[75771]: 10.11 scrub ok
Jan 23 09:53:43 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 23 09:53:43 compute-2 ceph-mon[75771]: osdmap e98: 3 total, 3 up, 3 in
Jan 23 09:53:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 23 09:53:43 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Jan 23 09:53:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:43 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:43 compute-2 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Jan 23 09:53:44 compute-2 ceph-mon[75771]: 10.3 scrub starts
Jan 23 09:53:44 compute-2 ceph-mon[75771]: 10.3 scrub ok
Jan 23 09:53:44 compute-2 ceph-mon[75771]: pgmap v132: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:44 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 23 09:53:44 compute-2 ceph-mon[75771]: osdmap e99: 3 total, 3 up, 3 in
Jan 23 09:53:44 compute-2 ceph-mon[75771]: 10.14 scrub starts
Jan 23 09:53:44 compute-2 ceph-mon[75771]: 10.14 scrub ok
Jan 23 09:53:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 23 09:53:44 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 100 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100) [2] r=0 lpr=100 pi=[82,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:44 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 100 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100) [2] r=0 lpr=100 pi=[82,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:44 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:45 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 23 09:53:45 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 101 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[82,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:45 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 101 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[82,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:45 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 101 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[82,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:45 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 101 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[82,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:45 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:45 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 23 09:53:45 compute-2 ceph-mon[75771]: osdmap e100: 3 total, 3 up, 3 in
Jan 23 09:53:45 compute-2 ceph-mon[75771]: 10.c scrub starts
Jan 23 09:53:45 compute-2 ceph-mon[75771]: 10.c scrub ok
Jan 23 09:53:45 compute-2 sudo[84533]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:46 compute-2 systemd[77796]: Starting Mark boot as successful...
Jan 23 09:53:46 compute-2 systemd[77796]: Finished Mark boot as successful.
Jan 23 09:53:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:46 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:47 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 23 09:53:47 compute-2 ceph-mon[75771]: pgmap v135: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Jan 23 09:53:47 compute-2 ceph-mon[75771]: osdmap e101: 3 total, 3 up, 3 in
Jan 23 09:53:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:47 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:48 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:49 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:49 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:51 compute-2 ceph-mds[83039]: mds.beacon.cephfs.compute-2.prgzmm missed beacon ack from the monitors
Jan 23 09:53:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:52 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:53 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:53 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 23 09:53:53 compute-2 ceph-mon[75771]: pgmap v137: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:53 compute-2 ceph-mon[75771]: osdmap e102: 3 total, 3 up, 3 in
Jan 23 09:53:53 compute-2 ceph-mon[75771]: pgmap v139: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:53 compute-2 ceph-mon[75771]: pgmap v140: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:53 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 09:53:53 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 09:53:53 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 09:53:53 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.10( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=103) [2] r=0 lpr=103 pi=[59,103)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:53 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 luod=0'0 crt=62'761 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:53 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 crt=62'761 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:53 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:53 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 23 09:53:54 compute-2 ceph-mon[75771]: pgmap v141: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:54 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 09:53:54 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 09:53:54 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 09:53:54 compute-2 ceph-mon[75771]: osdmap e103: 3 total, 3 up, 3 in
Jan 23 09:53:54 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 104 pg[10.10( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] r=-1 lpr=104 pi=[59,104)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:54 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 104 pg[10.10( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] r=-1 lpr=104 pi=[59,104)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:54 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 104 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 crt=62'761 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:54 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 104 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=103/104 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:55 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:55 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 23 09:53:55 compute-2 ceph-mon[75771]: osdmap e104: 3 total, 3 up, 3 in
Jan 23 09:53:55 compute-2 ceph-mon[75771]: pgmap v144: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 27 B/s, 0 objects/s recovering
Jan 23 09:53:55 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 23 09:53:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:56 compute-2 sshd-session[84167]: Connection closed by 192.168.122.30 port 45662
Jan 23 09:53:56 compute-2 sshd-session[84164]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:53:56 compute-2 systemd[1]: session-36.scope: Deactivated successfully.
Jan 23 09:53:56 compute-2 systemd[1]: session-36.scope: Consumed 9.093s CPU time.
Jan 23 09:53:56 compute-2 systemd-logind[786]: Session 36 logged out. Waiting for processes to exit.
Jan 23 09:53:56 compute-2 systemd-logind[786]: Removed session 36.
Jan 23 09:53:56 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 23 09:53:56 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 106 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=0/0 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106) [2] r=0 lpr=106 pi=[59,106)/1 luod=0'0 crt=58'754 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:56 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 106 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=0/0 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106) [2] r=0 lpr=106 pi=[59,106)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 23 09:53:56 compute-2 ceph-mon[75771]: osdmap e105: 3 total, 3 up, 3 in
Jan 23 09:53:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-2 ceph-mon[75771]: Deploying daemon haproxy.rgw.default.compute-0.qabsws on compute-0
Jan 23 09:53:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:57 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:57 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 23 09:53:57 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 107 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=106/107 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106) [2] r=0 lpr=106 pi=[59,106)/1 crt=58'754 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:57 compute-2 ceph-mon[75771]: osdmap e106: 3 total, 3 up, 3 in
Jan 23 09:53:57 compute-2 ceph-mon[75771]: pgmap v147: 353 pgs: 353 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:57 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 23 09:53:57 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 23 09:53:58 compute-2 sudo[84609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:53:58 compute-2 sudo[84609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:53:58 compute-2 sudo[84609]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:58 compute-2 sudo[84634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:53:58 compute-2 sudo[84634]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:53:58 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 107 pg[10.12( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=107) [2] r=0 lpr=107 pi=[67,107)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:58 compute-2 podman[84701]: 2026-01-23 09:53:58.715158748 +0000 UTC m=+0.047490704 container create 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 09:53:58 compute-2 systemd[1]: Started libpod-conmon-924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414.scope.
Jan 23 09:53:58 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:53:58 compute-2 podman[84701]: 2026-01-23 09:53:58.696130132 +0000 UTC m=+0.028462108 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 09:53:58 compute-2 ceph-mon[75771]: osdmap e107: 3 total, 3 up, 3 in
Jan 23 09:53:58 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:58 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:58 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:58 compute-2 ceph-mon[75771]: Deploying daemon haproxy.rgw.default.compute-2.izjwnk on compute-2
Jan 23 09:53:58 compute-2 podman[84701]: 2026-01-23 09:53:58.804553103 +0000 UTC m=+0.136885069 container init 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 09:53:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:58 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:58 compute-2 podman[84701]: 2026-01-23 09:53:58.81381672 +0000 UTC m=+0.146148666 container start 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 09:53:58 compute-2 podman[84701]: 2026-01-23 09:53:58.818558422 +0000 UTC m=+0.150890468 container attach 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 09:53:58 compute-2 inspiring_carver[84717]: 0 0
Jan 23 09:53:58 compute-2 systemd[1]: libpod-924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414.scope: Deactivated successfully.
Jan 23 09:53:58 compute-2 conmon[84717]: conmon 924e5abd11de385a333b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414.scope/container/memory.events
Jan 23 09:53:58 compute-2 podman[84701]: 2026-01-23 09:53:58.821987902 +0000 UTC m=+0.154319848 container died 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 09:53:58 compute-2 systemd[1]: var-lib-containers-storage-overlay-6772238725dedc4f7395fc6b6d506a03e7015afb4a207bd32baa989f5a107db7-merged.mount: Deactivated successfully.
Jan 23 09:53:58 compute-2 podman[84701]: 2026-01-23 09:53:58.866946675 +0000 UTC m=+0.199278621 container remove 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 09:53:58 compute-2 systemd[1]: libpod-conmon-924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414.scope: Deactivated successfully.
Jan 23 09:53:58 compute-2 systemd[1]: Reloading.
Jan 23 09:53:59 compute-2 systemd-rc-local-generator[84762]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:59 compute-2 systemd-sysv-generator[84765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:59 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6854000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 23 09:53:59 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 108 pg[10.12( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=108) [2]/[1] r=-1 lpr=108 pi=[67,108)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:59 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 108 pg[10.12( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=108) [2]/[1] r=-1 lpr=108 pi=[67,108)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:59 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 108 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=108 pruub=12.026422501s) [1] r=-1 lpr=108 pi=[66,108)/1 crt=62'759 mlcod 0'0 active pruub 129.126800537s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:59 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 108 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=108 pruub=12.026377678s) [1] r=-1 lpr=108 pi=[66,108)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 129.126800537s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:59 compute-2 systemd[1]: Reloading.
Jan 23 09:53:59 compute-2 systemd-rc-local-generator[84804]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:59 compute-2 systemd-sysv-generator[84808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:53:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.004000091s ======
Jan 23 09:53:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:59.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000091s
Jan 23 09:53:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:59 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:59 compute-2 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.izjwnk for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:53:59 compute-2 podman[84861]: 2026-01-23 09:53:59.758323588 +0000 UTC m=+0.041665028 container create c0d2fdbe15736053ab1c3c44f6e122e8a046bd96739d6c407e37736b0b1b24d0 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-rgw-default-compute-2-izjwnk)
Jan 23 09:53:59 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73775bdafb1314bd2f9de495c4be70618a2025b8e05554445d1ca65eb394d1b9/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 23 09:53:59 compute-2 podman[84861]: 2026-01-23 09:53:59.808172416 +0000 UTC m=+0.091513866 container init c0d2fdbe15736053ab1c3c44f6e122e8a046bd96739d6c407e37736b0b1b24d0 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-rgw-default-compute-2-izjwnk)
Jan 23 09:53:59 compute-2 ceph-mon[75771]: pgmap v149: 353 pgs: 353 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 459 B/s rd, 0 op/s
Jan 23 09:53:59 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 23 09:53:59 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 23 09:53:59 compute-2 ceph-mon[75771]: osdmap e108: 3 total, 3 up, 3 in
Jan 23 09:53:59 compute-2 podman[84861]: 2026-01-23 09:53:59.813107832 +0000 UTC m=+0.096449272 container start c0d2fdbe15736053ab1c3c44f6e122e8a046bd96739d6c407e37736b0b1b24d0 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-rgw-default-compute-2-izjwnk)
Jan 23 09:53:59 compute-2 bash[84861]: c0d2fdbe15736053ab1c3c44f6e122e8a046bd96739d6c407e37736b0b1b24d0
Jan 23 09:53:59 compute-2 podman[84861]: 2026-01-23 09:53:59.740627913 +0000 UTC m=+0.023969373 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 09:53:59 compute-2 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.izjwnk for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:53:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-rgw-default-compute-2-izjwnk[84876]: [NOTICE] 022/095359 (2) : New worker #1 (4) forked
Jan 23 09:53:59 compute-2 sudo[84634]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 23 09:54:00 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 109 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=109) [1]/[2] r=0 lpr=109 pi=[66,109)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:00 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 109 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=109) [1]/[2] r=0 lpr=109 pi=[66,109)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095400 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:54:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:00 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:00 compute-2 ceph-mon[75771]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 23 09:54:00 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:00 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:00 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:00 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:00 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:54:00 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:54:00 compute-2 ceph-mon[75771]: Deploying daemon keepalived.rgw.default.compute-0.tytkrd on compute-0
Jan 23 09:54:00 compute-2 ceph-mon[75771]: osdmap e109: 3 total, 3 up, 3 in
Jan 23 09:54:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:01 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 23 09:54:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:01.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 110 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=0/0 n=4 ec=59/46 lis/c=108/67 les/c/f=109/68/0 sis=110) [2] r=0 lpr=110 pi=[67,110)/1 luod=0'0 crt=61'760 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 110 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=0/0 n=4 ec=59/46 lis/c=108/67 les/c/f=109/68/0 sis=110) [2] r=0 lpr=110 pi=[67,110)/1 crt=61'760 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:01 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 110 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=109/110 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=109) [1]/[2] async=[1] r=0 lpr=109 pi=[66,109)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:54:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:01.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:01 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6854001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:01 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:01 compute-2 ceph-mon[75771]: pgmap v152: 353 pgs: 1 remapped+peering, 352 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 467 B/s rd, 0 op/s; 0 B/s, 0 objects/s recovering
Jan 23 09:54:01 compute-2 ceph-mon[75771]: osdmap e110: 3 total, 3 up, 3 in
Jan 23 09:54:02 compute-2 sudo[84891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:02 compute-2 sudo[84891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:02 compute-2 sudo[84891]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 23 09:54:02 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 111 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=109/110 n=5 ec=59/46 lis/c=109/66 les/c/f=110/67/0 sis=111 pruub=14.992680550s) [1] async=[1] r=-1 lpr=111 pi=[66,111)/1 crt=62'759 mlcod 62'759 active pruub 135.129196167s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:02 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 111 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=109/110 n=5 ec=59/46 lis/c=109/66 les/c/f=110/67/0 sis=111 pruub=14.992507935s) [1] r=-1 lpr=111 pi=[66,111)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 135.129196167s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:02 compute-2 sudo[84916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:02 compute-2 sudo[84916]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:02 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 111 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=110/111 n=4 ec=59/46 lis/c=108/67 les/c/f=109/68/0 sis=110) [2] r=0 lpr=110 pi=[67,110)/1 crt=61'760 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:54:02 compute-2 podman[84982]: 2026-01-23 09:54:02.60883274 +0000 UTC m=+0.040942471 container create 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, name=keepalived, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, distribution-scope=public, vcs-type=git, release=1793)
Jan 23 09:54:02 compute-2 systemd[1]: Started libpod-conmon-3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e.scope.
Jan 23 09:54:02 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:54:02 compute-2 podman[84982]: 2026-01-23 09:54:02.590273225 +0000 UTC m=+0.022382976 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 09:54:02 compute-2 podman[84982]: 2026-01-23 09:54:02.699053084 +0000 UTC m=+0.131162835 container init 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, name=keepalived, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793)
Jan 23 09:54:02 compute-2 podman[84982]: 2026-01-23 09:54:02.706161651 +0000 UTC m=+0.138271382 container start 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, version=2.2.4, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, name=keepalived, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph)
Jan 23 09:54:02 compute-2 podman[84982]: 2026-01-23 09:54:02.70996336 +0000 UTC m=+0.142073121 container attach 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, io.buildah.version=1.28.2, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, architecture=x86_64, vcs-type=git, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vendor=Red Hat, Inc.)
Jan 23 09:54:02 compute-2 gallant_lamport[84998]: 0 0
Jan 23 09:54:02 compute-2 systemd[1]: libpod-3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e.scope: Deactivated successfully.
Jan 23 09:54:02 compute-2 conmon[84998]: conmon 3f5566e36fa0d56d11b5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e.scope/container/memory.events
Jan 23 09:54:02 compute-2 podman[84982]: 2026-01-23 09:54:02.7154741 +0000 UTC m=+0.147583821 container died 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, vcs-type=git, release=1793, version=2.2.4, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 23 09:54:02 compute-2 systemd[1]: var-lib-containers-storage-overlay-98917a16612cec2a1fa02256d3a7e0a6490ab36aade52f28bc04c21f202e332b-merged.mount: Deactivated successfully.
Jan 23 09:54:02 compute-2 podman[84982]: 2026-01-23 09:54:02.75983384 +0000 UTC m=+0.191943571 container remove 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived)
Jan 23 09:54:02 compute-2 systemd[1]: libpod-conmon-3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e.scope: Deactivated successfully.
Jan 23 09:54:02 compute-2 systemd[1]: Reloading.
Jan 23 09:54:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:02 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:02 compute-2 systemd-sysv-generator[85050]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:54:02 compute-2 systemd-rc-local-generator[85040]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:54:03 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:03 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:03 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:03 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:54:03 compute-2 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:54:03 compute-2 ceph-mon[75771]: Deploying daemon keepalived.rgw.default.compute-2.qpmsjd on compute-2
Jan 23 09:54:03 compute-2 ceph-mon[75771]: osdmap e111: 3 total, 3 up, 3 in
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:03 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy ignored for local
Jan 23 09:54:03 compute-2 kernel: ganesha.nfsd[83423]: segfault at 50 ip 00007f68d4c9f32e sp 00007f683dffa210 error 4 in libntirpc.so.5.8[7f68d4c84000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 23 09:54:03 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 09:54:03 compute-2 systemd[1]: Created slice Slice /system/systemd-coredump.
Jan 23 09:54:03 compute-2 systemd[1]: Started Process Core Dump (PID 85055/UID 0).
Jan 23 09:54:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 09:54:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:03.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 09:54:03 compute-2 systemd[1]: Reloading.
Jan 23 09:54:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 23 09:54:03 compute-2 systemd-rc-local-generator[85089]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:54:03 compute-2 systemd-sysv-generator[85094]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:54:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:03.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:03 compute-2 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.qpmsjd for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:54:03 compute-2 podman[85150]: 2026-01-23 09:54:03.72194761 +0000 UTC m=+0.052456781 container create 4079f34468022a0cd827d5befc6f77a5139d567959582991ace2a2e3465960c7 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, release=1793, io.buildah.version=1.28.2, distribution-scope=public, io.openshift.expose-services=, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 23 09:54:03 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c3c97516a3697cea5613434af93e4104ed25eb6e4dd09966eed88fc66d25fc3/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:03 compute-2 podman[85150]: 2026-01-23 09:54:03.784577937 +0000 UTC m=+0.115087108 container init 4079f34468022a0cd827d5befc6f77a5139d567959582991ace2a2e3465960c7 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd, release=1793, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, name=keepalived, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, vcs-type=git, vendor=Red Hat, Inc.)
Jan 23 09:54:03 compute-2 podman[85150]: 2026-01-23 09:54:03.698641043 +0000 UTC m=+0.029150234 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 09:54:03 compute-2 podman[85150]: 2026-01-23 09:54:03.807072525 +0000 UTC m=+0.137581716 container start 4079f34468022a0cd827d5befc6f77a5139d567959582991ace2a2e3465960c7 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, release=1793, distribution-scope=public, name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 23 09:54:03 compute-2 bash[85150]: 4079f34468022a0cd827d5befc6f77a5139d567959582991ace2a2e3465960c7
Jan 23 09:54:03 compute-2 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.qpmsjd for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Failed to bind to process monitoring socket - errno 98 - Address already in use
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Starting VRRP child process, pid=4
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: (VI_0) Entering BACKUP STATE (init)
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Startup complete
Jan 23 09:54:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: VRRP_Script(check_backend) succeeded
Jan 23 09:54:03 compute-2 sudo[84916]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:04 compute-2 ceph-mon[75771]: pgmap v155: 353 pgs: 1 remapped+peering, 352 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Jan 23 09:54:04 compute-2 ceph-mon[75771]: osdmap e112: 3 total, 3 up, 3 in
Jan 23 09:54:04 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:04 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:04 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:04 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:05.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 23 09:54:05 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 113 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=113 pruub=11.421886444s) [1] r=-1 lpr=113 pi=[73,113)/1 crt=62'771 mlcod 0'0 active pruub 134.621353149s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:05 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 113 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=113 pruub=11.421828270s) [1] r=-1 lpr=113 pi=[73,113)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 134.621353149s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:05 compute-2 ceph-mon[75771]: Deploying daemon prometheus.compute-0 on compute-0
Jan 23 09:54:05 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 23 09:54:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:05.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:06 compute-2 systemd-coredump[85058]: Process 83370 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 53:
                                                   #0  0x00007f68d4c9f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Jan 23 09:54:06 compute-2 systemd[1]: systemd-coredump@0-85055-0.service: Deactivated successfully.
Jan 23 09:54:06 compute-2 systemd[1]: systemd-coredump@0-85055-0.service: Consumed 2.925s CPU time.
Jan 23 09:54:06 compute-2 podman[85178]: 2026-01-23 09:54:06.234439932 +0000 UTC m=+0.031010408 container died 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 09:54:06 compute-2 systemd[1]: var-lib-containers-storage-overlay-2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79-merged.mount: Deactivated successfully.
Jan 23 09:54:06 compute-2 ceph-mon[75771]: pgmap v157: 353 pgs: 353 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 22 B/s, 1 objects/s recovering
Jan 23 09:54:06 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 23 09:54:06 compute-2 ceph-mon[75771]: osdmap e113: 3 total, 3 up, 3 in
Jan 23 09:54:06 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:06 compute-2 podman[85178]: 2026-01-23 09:54:06.278073787 +0000 UTC m=+0.074644243 container remove 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:54:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 23 09:54:06 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 09:54:06 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 114 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=114) [1]/[2] r=0 lpr=114 pi=[73,114)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:06 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 114 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=114) [1]/[2] r=0 lpr=114 pi=[73,114)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:06 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 09:54:06 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.236s CPU time.
Jan 23 09:54:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:07.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:07 compute-2 ceph-mon[75771]: osdmap e114: 3 total, 3 up, 3 in
Jan 23 09:54:07 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 23 09:54:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 23 09:54:07 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 115 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=114/115 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=114) [1]/[2] async=[1] r=0 lpr=114 pi=[73,114)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:54:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:07.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:08 compute-2 ceph-mon[75771]: pgmap v160: 353 pgs: 353 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 44 B/s, 1 objects/s recovering
Jan 23 09:54:08 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 23 09:54:08 compute-2 ceph-mon[75771]: osdmap e115: 3 total, 3 up, 3 in
Jan 23 09:54:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 23 09:54:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 116 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=114/115 n=5 ec=59/46 lis/c=114/73 les/c/f=115/74/0 sis=116 pruub=14.343958855s) [1] async=[1] r=-1 lpr=116 pi=[73,116)/1 crt=62'771 mlcod 62'771 active pruub 141.281097412s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:08 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 116 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=114/115 n=5 ec=59/46 lis/c=114/73 les/c/f=115/74/0 sis=116 pruub=14.343770981s) [1] r=-1 lpr=116 pi=[73,116)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 141.281097412s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:09.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:09.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 23 09:54:10 compute-2 ceph-mon[75771]: osdmap e116: 3 total, 3 up, 3 in
Jan 23 09:54:10 compute-2 ceph-mon[75771]: pgmap v163: 353 pgs: 1 peering, 352 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 255 B/s wr, 1 op/s; 27 B/s, 2 objects/s recovering
Jan 23 09:54:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:11 compute-2 ceph-mon[75771]: osdmap e117: 3 total, 3 up, 3 in
Jan 23 09:54:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095411 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:54:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:11.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:11.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:12 compute-2 sshd-session[85227]: Accepted publickey for zuul from 192.168.122.30 port 59756 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:54:12 compute-2 systemd-logind[786]: New session 37 of user zuul.
Jan 23 09:54:12 compute-2 systemd[1]: Started Session 37 of User zuul.
Jan 23 09:54:12 compute-2 sshd-session[85227]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:54:12 compute-2 ceph-mon[75771]: pgmap v165: 353 pgs: 1 peering, 352 active+clean; 456 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 853 B/s rd, 426 B/s wr, 1 op/s; 0 B/s, 1 objects/s recovering
Jan 23 09:54:12 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:12 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:12 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:12 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  1: '-n'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  2: 'mgr.compute-2.uczrot'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  3: '-f'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  4: '--setuser'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  5: 'ceph'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  6: '--setgroup'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  7: 'ceph'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr respawn  exe_path /proc/self/exe
Jan 23 09:54:12 compute-2 sshd-session[77811]: Connection closed by 192.168.122.100 port 35598
Jan 23 09:54:12 compute-2 sshd-session[77792]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:54:12 compute-2 systemd[1]: session-34.scope: Deactivated successfully.
Jan 23 09:54:12 compute-2 systemd[1]: session-34.scope: Consumed 30.610s CPU time.
Jan 23 09:54:12 compute-2 systemd-logind[786]: Session 34 logged out. Waiting for processes to exit.
Jan 23 09:54:12 compute-2 systemd-logind[786]: Removed session 34.
Jan 23 09:54:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setuser ceph since I am not root
Jan 23 09:54:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setgroup ceph since I am not root
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 09:54:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:12.553+0000 7f4b4bc8d140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 09:54:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:12.669+0000 7f4b4bc8d140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:54:12 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 09:54:12 compute-2 python3.9[85401]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 09:54:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:13.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:13.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:13 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 09:54:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:13.663+0000 7f4b4bc8d140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:54:13 compute-2 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:54:13 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 09:54:14 compute-2 python3.9[85586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:54:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:14.490+0000 7f4b4bc8d140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:54:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:54:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:54:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:   from numpy import show_config as show_numpy_config
Jan 23 09:54:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:14.722+0000 7f4b4bc8d140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 09:54:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:14.802+0000 7f4b4bc8d140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 09:54:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:14.951+0000 7f4b4bc8d140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:54:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:15.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:15.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:15 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 09:54:15 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:54:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:15 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 09:54:15 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.156+0000 7f4b4bc8d140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.427+0000 7f4b4bc8d140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.514+0000 7f4b4bc8d140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 09:54:16 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 1.
Jan 23 09:54:16 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:54:16 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.236s CPU time.
Jan 23 09:54:16 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:54:16 compute-2 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Jan 23 09:54:16 compute-2 ceph-mon[75771]: mgrmap e26: compute-0.nbdygh(active, since 2m), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.590021) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056590296, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1003, "num_deletes": 251, "total_data_size": 2240482, "memory_usage": 2276192, "flush_reason": "Manual Compaction"}
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056603909, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1428848, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7512, "largest_seqno": 8510, "table_properties": {"data_size": 1424071, "index_size": 2301, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10217, "raw_average_key_size": 18, "raw_value_size": 1414138, "raw_average_value_size": 2566, "num_data_blocks": 102, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162023, "oldest_key_time": 1769162023, "file_creation_time": 1769162056, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 13925 microseconds, and 7524 cpu microseconds.
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.604024) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1428848 bytes OK
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.604066) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.606036) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.606065) EVENT_LOG_v1 {"time_micros": 1769162056606060, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.606092) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 2235274, prev total WAL file size 2235274, number of live WAL files 2.
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.607893) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1395KB)], [15(10MB)]
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056608130, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12879891, "oldest_snapshot_seqno": -1}
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.610+0000 7f4b4bc8d140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.715+0000 7f4b4bc8d140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3711 keys, 12440754 bytes, temperature: kUnknown
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056721302, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12440754, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12410095, "index_size": 20309, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 94976, "raw_average_key_size": 25, "raw_value_size": 12336134, "raw_average_value_size": 3324, "num_data_blocks": 879, "num_entries": 3711, "num_filter_entries": 3711, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162056, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.721590) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12440754 bytes
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.723256) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 113.7 rd, 109.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.9 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(17.7) write-amplify(8.7) OK, records in: 4238, records dropped: 527 output_compression: NoCompression
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.723277) EVENT_LOG_v1 {"time_micros": 1769162056723267, "job": 6, "event": "compaction_finished", "compaction_time_micros": 113237, "compaction_time_cpu_micros": 39837, "output_level": 6, "num_output_files": 1, "total_output_size": 12440754, "num_input_records": 4238, "num_output_records": 3711, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056723560, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056725314, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.607012) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725685) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725693) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725694) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-2 podman[85715]: 2026-01-23 09:54:16.802754113 +0000 UTC m=+0.043612065 container create 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.814+0000 7f4b4bc8d140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 09:54:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:16 compute-2 podman[85715]: 2026-01-23 09:54:16.871433248 +0000 UTC m=+0.112291220 container init 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:54:16 compute-2 podman[85715]: 2026-01-23 09:54:16.876567866 +0000 UTC m=+0.117425818 container start 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:54:16 compute-2 podman[85715]: 2026-01-23 09:54:16.781342555 +0000 UTC m=+0.022200527 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:54:16 compute-2 bash[85715]: 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 09:54:16 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 09:54:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:54:17 compute-2 sudo[85845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arzrbvkbwlwbjgrnwigsrmrghfceytpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162056.6460075-90-44868834025739/AnsiballZ_command.py'
Jan 23 09:54:17 compute-2 sudo[85845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:17.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:17.284+0000 7f4b4bc8d140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-2 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:54:17 compute-2 python3.9[85847]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:54:17 compute-2 sudo[85845]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:17.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:17.426+0000 7f4b4bc8d140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-2 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 09:54:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:17 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 09:54:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:17.965+0000 7f4b4bc8d140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-2 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 09:54:18 compute-2 sudo[86000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epodtqwoxgdltrkacwxmiwtbgtjsgstf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162057.845022-126-155682831472493/AnsiballZ_stat.py'
Jan 23 09:54:18 compute-2 sudo[86000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:18 compute-2 python3.9[86002]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:54:18 compute-2 sudo[86000]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:18.730+0000 7f4b4bc8d140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-2 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 09:54:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:18.815+0000 7f4b4bc8d140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-2 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:54:18 compute-2 ceph-mon[75771]: Standby manager daemon compute-1.jmakme restarted
Jan 23 09:54:18 compute-2 ceph-mon[75771]: Standby manager daemon compute-1.jmakme started
Jan 23 09:54:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:18.935+0000 7f4b4bc8d140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-2 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 09:54:19 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 09:54:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:19.126+0000 7f4b4bc8d140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:54:19 compute-2 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:54:19 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 09:54:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:19.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:19.209+0000 7f4b4bc8d140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:54:19 compute-2 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:54:19 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 09:54:19 compute-2 sudo[86154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kctyxdbhvapmlfmcelhsrvativoaydud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162058.8339174-159-89015204701080/AnsiballZ_file.py'
Jan 23 09:54:19 compute-2 sudo[86154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:19.396+0000 7f4b4bc8d140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:54:19 compute-2 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:54:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:19 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:54:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:19.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:19 compute-2 python3.9[86156]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:54:19 compute-2 sudo[86154]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:19.669+0000 7f4b4bc8d140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:54:19 compute-2 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:54:19 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 09:54:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 23 09:54:19 compute-2 ceph-mon[75771]: mgrmap e27: compute-0.nbdygh(active, since 2m), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:19 compute-2 ceph-mon[75771]: Active manager daemon compute-0.nbdygh restarted
Jan 23 09:54:19 compute-2 ceph-mon[75771]: Activating manager daemon compute-0.nbdygh
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:20.020+0000 7f4b4bc8d140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 09:54:20 compute-2 sudo[86307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dulpgpwdcnhbrdjwjyikdbrkzejxicrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162059.7832081-186-125552298636485/AnsiballZ_file.py'
Jan 23 09:54:20 compute-2 sudo[86307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:20.111+0000 7f4b4bc8d140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: [23/Jan/2026:09:54:20] ENGINE Bus STARTING
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: CherryPy Checker:
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: The Application mounted at '' has an empty config.
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: mgr load Constructed class from module: dashboard
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: mgr load Constructed class from module: prometheus
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [prometheus INFO root] server_addr: :: server_port: 9283
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [prometheus INFO root] Starting engine...
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55f67dd3d860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:20] ENGINE Bus STARTING
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [dashboard INFO root] Starting engine...
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [dashboard INFO root] Engine started...
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: [23/Jan/2026:09:54:20] ENGINE Serving on http://:::9283
Jan 23 09:54:20 compute-2 python3.9[86309]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:20] ENGINE Serving on http://:::9283
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: [23/Jan/2026:09:54:20] ENGINE Bus STARTED
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:20] ENGINE Bus STARTED
Jan 23 09:54:20 compute-2 ceph-mgr[76120]: [prometheus INFO root] Engine started.
Jan 23 09:54:20 compute-2 sudo[86307]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:20 compute-2 sshd-session[86334]: Accepted publickey for ceph-admin from 192.168.122.100 port 40786 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:54:20 compute-2 systemd-logind[786]: New session 38 of user ceph-admin.
Jan 23 09:54:20 compute-2 systemd[1]: Started Session 38 of User ceph-admin.
Jan 23 09:54:20 compute-2 sshd-session[86334]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:54:20 compute-2 sudo[86364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:20 compute-2 sudo[86364]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:20 compute-2 sudo[86364]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:20 compute-2 sudo[86416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:54:20 compute-2 sudo[86416]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:20 compute-2 ceph-mon[75771]: osdmap e118: 3 total, 3 up, 3 in
Jan 23 09:54:20 compute-2 ceph-mon[75771]: mgrmap e28: compute-0.nbdygh(active, starting, since 0.0302434s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.ymknms"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.prgzmm"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.bcvzvj"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-0.nbdygh", "id": "compute-0.nbdygh"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-2.uczrot", "id": "compute-2.uczrot"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-1.jmakme", "id": "compute-1.jmakme"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: Manager daemon compute-0.nbdygh is now available
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 09:54:20 compute-2 ceph-mon[75771]: Standby manager daemon compute-2.uczrot restarted
Jan 23 09:54:20 compute-2 ceph-mon[75771]: Standby manager daemon compute-2.uczrot started
Jan 23 09:54:21 compute-2 python3.9[86575]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:54:21 compute-2 podman[86606]: 2026-01-23 09:54:21.171609132 +0000 UTC m=+0.068906523 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:54:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:21.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:21 compute-2 network[86643]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:54:21 compute-2 network[86644]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:54:21 compute-2 network[86645]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:54:21 compute-2 podman[86606]: 2026-01-23 09:54:21.310197912 +0000 UTC m=+0.207495303 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:54:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:21.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095421 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:54:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [NOTICE] 022/095421 (4) : haproxy version is 2.3.17-d1c9119
Jan 23 09:54:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [NOTICE] 022/095421 (4) : path to executable is /usr/local/sbin/haproxy
Jan 23 09:54:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [ALERT] 022/095421 (4) : backend 'backend' has no server available!
Jan 23 09:54:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 23 09:54:22 compute-2 ceph-mon[75771]: mgrmap e29: compute-0.nbdygh(active, since 1.08581s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:22 compute-2 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Bus STARTING
Jan 23 09:54:22 compute-2 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Serving on http://192.168.122.100:8765
Jan 23 09:54:22 compute-2 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Serving on https://192.168.122.100:7150
Jan 23 09:54:22 compute-2 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Bus STARTED
Jan 23 09:54:22 compute-2 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Client ('192.168.122.100', 52034) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 09:54:22 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 23 09:54:22 compute-2 podman[86744]: 2026-01-23 09:54:22.433648593 +0000 UTC m=+0.188800595 container exec 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:22 compute-2 podman[86744]: 2026-01-23 09:54:22.449241818 +0000 UTC m=+0.204393790 container exec_died 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:22 compute-2 podman[86884]: 2026-01-23 09:54:22.919283244 +0000 UTC m=+0.062377393 container exec 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:54:22 compute-2 podman[86884]: 2026-01-23 09:54:22.927761498 +0000 UTC m=+0.070855627 container exec_died 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:54:23 compute-2 podman[86961]: 2026-01-23 09:54:23.156045962 +0000 UTC m=+0.055562928 container exec c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 09:54:23 compute-2 podman[86961]: 2026-01-23 09:54:23.17130775 +0000 UTC m=+0.070824726 container exec_died c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 09:54:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:23.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:23.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:23 compute-2 podman[87040]: 2026-01-23 09:54:23.452055412 +0000 UTC m=+0.104688969 container exec 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, release=1793, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived)
Jan 23 09:54:23 compute-2 podman[87040]: 2026-01-23 09:54:23.468142279 +0000 UTC m=+0.120775866 container exec_died 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, description=keepalived for Ceph, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container)
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 09:54:23 compute-2 ceph-mon[75771]: pgmap v4: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:54:23 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 23 09:54:23 compute-2 ceph-mon[75771]: osdmap e119: 3 total, 3 up, 3 in
Jan 23 09:54:23 compute-2 ceph-mon[75771]: mgrmap e30: compute-0.nbdygh(active, since 2s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:54:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:54:23 compute-2 sudo[86416]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:23 compute-2 sudo[87128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:23 compute-2 sudo[87128]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:23 compute-2 sudo[87128]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:23 compute-2 sudo[87158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:54:23 compute-2 sudo[87158]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:24 compute-2 sudo[87158]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:24 compute-2 sudo[87318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:24 compute-2 sudo[87318]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:24 compute-2 sudo[87318]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:24 compute-2 sudo[87368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 09:54:24 compute-2 sudo[87368]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:24 compute-2 python3.9[87416]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:54:24 compute-2 sudo[87368]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:24 compute-2 ceph-mon[75771]: pgmap v6: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:54:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 23 09:54:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 09:54:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:25.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 09:54:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 23 09:54:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:25.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:25 compute-2 sudo[87587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:25 compute-2 sudo[87587]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:25 compute-2 python3.9[87586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:54:25 compute-2 sudo[87587]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:25 compute-2 sudo[87613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- inventory --format=json-pretty --filter-for-batch
Jan 23 09:54:25 compute-2 sudo[87613]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:26 compute-2 podman[87705]: 2026-01-23 09:54:26.155658955 +0000 UTC m=+0.049354126 container create c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:54:26 compute-2 systemd[1]: Started libpod-conmon-c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b.scope.
Jan 23 09:54:26 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:54:26 compute-2 podman[87705]: 2026-01-23 09:54:26.134014612 +0000 UTC m=+0.027709813 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:54:26 compute-2 podman[87705]: 2026-01-23 09:54:26.237211034 +0000 UTC m=+0.130906235 container init c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:54:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 23 09:54:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-2 ceph-mon[75771]: osdmap e120: 3 total, 3 up, 3 in
Jan 23 09:54:26 compute-2 ceph-mon[75771]: mgrmap e31: compute-0.nbdygh(active, since 5s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 09:54:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 23 09:54:26 compute-2 podman[87705]: 2026-01-23 09:54:26.24886941 +0000 UTC m=+0.142564591 container start c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True)
Jan 23 09:54:26 compute-2 podman[87705]: 2026-01-23 09:54:26.253659229 +0000 UTC m=+0.147354540 container attach c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 09:54:26 compute-2 brave_swartz[87721]: 167 167
Jan 23 09:54:26 compute-2 systemd[1]: libpod-c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b.scope: Deactivated successfully.
Jan 23 09:54:26 compute-2 podman[87705]: 2026-01-23 09:54:26.258896849 +0000 UTC m=+0.152592030 container died c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:54:26 compute-2 systemd[1]: var-lib-containers-storage-overlay-2d9b606e8a127c8f73b9b0b674ec1b55e1e503f2327fddc498db858558e8ada5-merged.mount: Deactivated successfully.
Jan 23 09:54:26 compute-2 podman[87705]: 2026-01-23 09:54:26.303814833 +0000 UTC m=+0.197510004 container remove c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 23 09:54:26 compute-2 systemd[1]: libpod-conmon-c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b.scope: Deactivated successfully.
Jan 23 09:54:26 compute-2 podman[87744]: 2026-01-23 09:54:26.45896831 +0000 UTC m=+0.043420501 container create 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:54:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095426 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:54:26 compute-2 systemd[1]: Started libpod-conmon-373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981.scope.
Jan 23 09:54:26 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:54:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:26 compute-2 podman[87744]: 2026-01-23 09:54:26.439288281 +0000 UTC m=+0.023740482 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:54:26 compute-2 podman[87744]: 2026-01-23 09:54:26.540220282 +0000 UTC m=+0.124672493 container init 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 23 09:54:26 compute-2 podman[87744]: 2026-01-23 09:54:26.601060829 +0000 UTC m=+0.185513010 container start 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 09:54:26 compute-2 podman[87744]: 2026-01-23 09:54:26.604898447 +0000 UTC m=+0.189350658 container attach 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:54:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 23 09:54:27 compute-2 python3.9[87892]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:54:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:27.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:27.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:27 compute-2 funny_kirch[87792]: [
Jan 23 09:54:27 compute-2 funny_kirch[87792]:     {
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         "available": false,
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         "being_replaced": false,
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         "ceph_device_lvm": false,
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         "lsm_data": {},
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         "lvs": [],
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         "path": "/dev/sr0",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         "rejected_reasons": [
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "Insufficient space (<5GB)",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "Has a FileSystem"
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         ],
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         "sys_api": {
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "actuators": null,
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "device_nodes": [
Jan 23 09:54:27 compute-2 funny_kirch[87792]:                 "sr0"
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             ],
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "devname": "sr0",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "human_readable_size": "482.00 KB",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "id_bus": "ata",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "model": "QEMU DVD-ROM",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "nr_requests": "2",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "parent": "/dev/sr0",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "partitions": {},
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "path": "/dev/sr0",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "removable": "1",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "rev": "2.5+",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "ro": "0",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "rotational": "1",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "sas_address": "",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "sas_device_handle": "",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "scheduler_mode": "mq-deadline",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "sectors": 0,
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "sectorsize": "2048",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "size": 493568.0,
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "support_discard": "2048",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "type": "disk",
Jan 23 09:54:27 compute-2 funny_kirch[87792]:             "vendor": "QEMU"
Jan 23 09:54:27 compute-2 funny_kirch[87792]:         }
Jan 23 09:54:27 compute-2 funny_kirch[87792]:     }
Jan 23 09:54:27 compute-2 funny_kirch[87792]: ]
Jan 23 09:54:27 compute-2 systemd[1]: libpod-373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981.scope: Deactivated successfully.
Jan 23 09:54:27 compute-2 podman[87744]: 2026-01-23 09:54:27.561931015 +0000 UTC m=+1.146383216 container died 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 23 09:54:27 compute-2 ceph-mon[75771]: pgmap v8: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:54:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 23 09:54:27 compute-2 ceph-mon[75771]: osdmap e121: 3 total, 3 up, 3 in
Jan 23 09:54:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 09:54:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:27 compute-2 systemd[1]: var-lib-containers-storage-overlay-e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649-merged.mount: Deactivated successfully.
Jan 23 09:54:27 compute-2 podman[87744]: 2026-01-23 09:54:27.67579049 +0000 UTC m=+1.260242671 container remove 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 23 09:54:27 compute-2 systemd[1]: libpod-conmon-373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981.scope: Deactivated successfully.
Jan 23 09:54:27 compute-2 sudo[87613]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:27 compute-2 sudo[89006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:54:27 compute-2 sudo[89006]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:27 compute-2 sudo[89006]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:27 compute-2 sudo[89061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:54:27 compute-2 sudo[89061]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:27 compute-2 sudo[89061]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:54:28 compute-2 sudo[89112]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89112]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89143]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:28 compute-2 sudo[89143]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89143]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bedlofnppwmlckqzinidlfnbfgejpjla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162067.8021786-330-28531071949228/AnsiballZ_setup.py'
Jan 23 09:54:28 compute-2 sudo[89220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:28 compute-2 sudo[89201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:54:28 compute-2 sudo[89201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89201]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:54:28 compute-2 sudo[89263]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89263]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:54:28 compute-2 sudo[89289]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89289]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 23 09:54:28 compute-2 sudo[89314]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89314]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 python3.9[89237]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:54:28 compute-2 sudo[89339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:54:28 compute-2 sudo[89339]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89339]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:54:28 compute-2 sudo[89369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89369]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:54:28 compute-2 sudo[89394]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89394]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:28 compute-2 sudo[89419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:28 compute-2 sudo[89419]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89419]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89220]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:54:28 compute-2 sudo[89447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89447]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 09:54:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:54:28 compute-2 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 09:54:28 compute-2 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 09:54:28 compute-2 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 09:54:28 compute-2 ceph-mon[75771]: pgmap v10: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 682 B/s wr, 14 op/s
Jan 23 09:54:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 23 09:54:28 compute-2 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:54:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 23 09:54:28 compute-2 sudo[89495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:54:28 compute-2 sudo[89495]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-2 sudo[89495]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-2 sudo[89520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:54:29 compute-2 sudo[89617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnqegfjkmieugfjsnlllsonwnfydmvjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162067.8021786-330-28531071949228/AnsiballZ_dnf.py'
Jan 23 09:54:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:29.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:29.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:29 compute-2 sudo[89520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-2 sudo[89617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:29 compute-2 sudo[89520]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-2 sudo[89621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:54:29 compute-2 sudo[89621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-2 sudo[89621]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:29 compute-2 sudo[89646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:54:29 compute-2 sudo[89646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-2 sudo[89646]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000005:nfs.cephfs.1: -2
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 09:54:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:54:29 compute-2 sudo[89671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:54:29 compute-2 sudo[89671]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-2 sudo[89671]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-2 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:54:29 compute-2 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:54:29 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 23 09:54:29 compute-2 ceph-mon[75771]: osdmap e122: 3 total, 3 up, 3 in
Jan 23 09:54:29 compute-2 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:54:29 compute-2 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:54:29 compute-2 python3.9[89619]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:54:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 23 09:54:29 compute-2 sudo[89708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:54:29 compute-2 sudo[89708]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-2 sudo[89708]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-2 sudo[89734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:29 compute-2 sudo[89734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-2 sudo[89734]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-2 sudo[89760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:54:29 compute-2 sudo[89760]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-2 sudo[89760]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[89808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:54:30 compute-2 sudo[89808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[89808]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[89836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:54:30 compute-2 sudo[89836]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[89836]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[89862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 23 09:54:30 compute-2 sudo[89862]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[89862]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[89889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:54:30 compute-2 sudo[89889]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[89889]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[89918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:54:30 compute-2 sudo[89918]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[89918]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[89943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:54:30 compute-2 sudo[89943]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[89943]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[89971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:30 compute-2 sudo[89971]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[89971]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[89996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:54:30 compute-2 sudo[89996]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[89996]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[90048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:54:30 compute-2 sudo[90048]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:30 compute-2 sudo[90048]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 sudo[90075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:54:30 compute-2 sudo[90075]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[90075]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:30 compute-2 sudo[90100]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:54:30 compute-2 sudo[90100]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-2 sudo[90100]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 23 09:54:30 compute-2 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:54:30 compute-2 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:54:30 compute-2 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:54:30 compute-2 ceph-mon[75771]: osdmap e123: 3 total, 3 up, 3 in
Jan 23 09:54:30 compute-2 ceph-mon[75771]: pgmap v13: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 880 B/s wr, 18 op/s
Jan 23 09:54:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 09:54:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-2 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:54:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 09:54:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:30 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:30 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:54:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 23 09:54:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0240016e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:31.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:31.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:31 compute-2 ceph-mon[75771]: pgmap v14: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 836 B/s wr, 17 op/s
Jan 23 09:54:31 compute-2 ceph-mon[75771]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 09:54:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 09:54:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 09:54:31 compute-2 ceph-mon[75771]: osdmap e124: 3 total, 3 up, 3 in
Jan 23 09:54:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:54:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:54:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:31 compute-2 ceph-mon[75771]: osdmap e125: 3 total, 3 up, 3 in
Jan 23 09:54:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 23 09:54:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:32 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:32 compute-2 ceph-mon[75771]: osdmap e126: 3 total, 3 up, 3 in
Jan 23 09:54:32 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 23 09:54:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 23 09:54:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095433 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:54:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:33.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:33.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:54:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:54:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:54:34 compute-2 ceph-mon[75771]: pgmap v18: 353 pgs: 1 active+recovering+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 6.7 KiB/s rd, 3.0 KiB/s wr, 10 op/s; 1/227 objects misplaced (0.441%); 36 B/s, 2 objects/s recovering
Jan 23 09:54:34 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 23 09:54:34 compute-2 ceph-mon[75771]: osdmap e127: 3 total, 3 up, 3 in
Jan 23 09:54:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 23 09:54:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:34 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018001b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:35 compute-2 ceph-mon[75771]: osdmap e128: 3 total, 3 up, 3 in
Jan 23 09:54:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 23 09:54:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:54:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 23 09:54:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:35.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 09:54:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:35.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 09:54:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 23 09:54:36 compute-2 ceph-mon[75771]: pgmap v21: 353 pgs: 1 active+recovering+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 5.0 KiB/s rd, 2.3 KiB/s wr, 7 op/s; 1/227 objects misplaced (0.441%); 27 B/s, 1 objects/s recovering
Jan 23 09:54:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 23 09:54:36 compute-2 ceph-mon[75771]: osdmap e129: 3 total, 3 up, 3 in
Jan 23 09:54:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:36 compute-2 sudo[90183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:54:36 compute-2 sudo[90183]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:36 compute-2 sudo[90183]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:36 compute-2 sudo[90208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:54:36 compute-2 sudo[90208]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:36 compute-2 sudo[90208]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:54:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000044s ======
Jan 23 09:54:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:37.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000044s
Jan 23 09:54:37 compute-2 ceph-mon[75771]: osdmap e130: 3 total, 3 up, 3 in
Jan 23 09:54:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 23 09:54:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 09:54:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 23 09:54:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:37 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 23 09:54:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:37.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018001b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:38 compute-2 ceph-mon[75771]: pgmap v24: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 4.0 KiB/s rd, 1023 B/s wr, 4 op/s; 27 B/s, 0 objects/s recovering
Jan 23 09:54:38 compute-2 ceph-mon[75771]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 23 09:54:38 compute-2 ceph-mon[75771]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 23 09:54:38 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 23 09:54:38 compute-2 ceph-mon[75771]: osdmap e131: 3 total, 3 up, 3 in
Jan 23 09:54:38 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:38 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:38 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.nbdygh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 09:54:38 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 09:54:38 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:38 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:39 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:39.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:39 compute-2 ceph-mon[75771]: Reconfiguring mgr.compute-0.nbdygh (monmap changed)...
Jan 23 09:54:39 compute-2 ceph-mon[75771]: Reconfiguring daemon mgr.compute-0.nbdygh on compute-0
Jan 23 09:54:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 09:54:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 23 09:54:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:39.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:39 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 23 09:54:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:39 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 132 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=132) [2] r=0 lpr=132 pi=[80,132)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:40 compute-2 ceph-mon[75771]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 23 09:54:40 compute-2 ceph-mon[75771]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 23 09:54:40 compute-2 ceph-mon[75771]: pgmap v26: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 883 B/s wr, 4 op/s; 23 B/s, 0 objects/s recovering
Jan 23 09:54:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:40 compute-2 ceph-mon[75771]: Reconfiguring osd.1 (monmap changed)...
Jan 23 09:54:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 23 09:54:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:40 compute-2 ceph-mon[75771]: Reconfiguring daemon osd.1 on compute-0
Jan 23 09:54:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 23 09:54:40 compute-2 ceph-mon[75771]: osdmap e132: 3 total, 3 up, 3 in
Jan 23 09:54:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 23 09:54:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 133 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] r=-1 lpr=133 pi=[80,133)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:40 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 133 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] r=-1 lpr=133 pi=[80,133)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:40 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018001b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:41 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:41.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:41.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:41 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:41 compute-2 ceph-mon[75771]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Jan 23 09:54:41 compute-2 ceph-mon[75771]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Jan 23 09:54:41 compute-2 ceph-mon[75771]: osdmap e133: 3 total, 3 up, 3 in
Jan 23 09:54:41 compute-2 ceph-mon[75771]: pgmap v29: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:54:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:54:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 23 09:54:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 134 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=134 pruub=8.646332741s) [0] r=-1 lpr=134 pi=[103,134)/1 crt=62'761 mlcod 0'0 active pruub 168.579345703s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:41 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 134 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=134 pruub=8.646267891s) [0] r=-1 lpr=134 pi=[103,134)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 168.579345703s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:42 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:43 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 23 09:54:43 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 135 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] r=0 lpr=135 pi=[103,135)/1 crt=62'761 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:43 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 135 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] r=0 lpr=135 pi=[103,135)/1 crt=62'761 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:43 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 135 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=0/0 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135) [2] r=0 lpr=135 pi=[80,135)/1 luod=0'0 crt=62'763 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:43 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 135 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=0/0 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135) [2] r=0 lpr=135 pi=[80,135)/1 crt=62'763 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:43.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:43.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:43 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095443 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:54:43 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:54:43 compute-2 ceph-mon[75771]: osdmap e134: 3 total, 3 up, 3 in
Jan 23 09:54:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:44 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 23 09:54:45 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 136 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=135/136 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135) [2] r=0 lpr=135 pi=[80,135)/1 crt=62'763 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:54:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:45 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:45.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:45.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:45 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:45 compute-2 ceph-mon[75771]: pgmap v31: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 564 B/s rd, 188 B/s wr, 1 op/s
Jan 23 09:54:45 compute-2 ceph-mon[75771]: osdmap e135: 3 total, 3 up, 3 in
Jan 23 09:54:45 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 136 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=135/136 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] async=[0] r=0 lpr=135 pi=[103,135)/1 crt=62'761 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:54:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 23 09:54:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 137 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=135/136 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137 pruub=15.589160919s) [0] async=[0] r=-1 lpr=137 pi=[103,137)/1 crt=62'761 mlcod 62'761 active pruub 179.562911987s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:46 compute-2 ceph-osd[81231]: osd.2 pg_epoch: 137 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=135/136 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137 pruub=15.589061737s) [0] r=-1 lpr=137 pi=[103,137)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 179.562911987s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:46 compute-2 ceph-mon[75771]: pgmap v33: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 588 B/s rd, 196 B/s wr, 1 op/s
Jan 23 09:54:46 compute-2 ceph-mon[75771]: osdmap e136: 3 total, 3 up, 3 in
Jan 23 09:54:46 compute-2 ceph-mon[75771]: osdmap e137: 3 total, 3 up, 3 in
Jan 23 09:54:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:46 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:46 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:46 compute-2 ceph-mon[75771]: Reconfiguring grafana.compute-0 (dependencies changed)...
Jan 23 09:54:46 compute-2 ceph-mon[75771]: Reconfiguring daemon grafana.compute-0 on compute-0
Jan 23 09:54:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:46 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 23 09:54:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:47.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:47.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c009ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:48 compute-2 ceph-mon[75771]: pgmap v36: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 23 B/s, 2 objects/s recovering
Jan 23 09:54:48 compute-2 ceph-mon[75771]: osdmap e138: 3 total, 3 up, 3 in
Jan 23 09:54:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:48 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:49 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:49.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:49 compute-2 ceph-mon[75771]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 23 09:54:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 09:54:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:49 compute-2 ceph-mon[75771]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 23 09:54:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 23 09:54:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 09:54:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:49.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 09:54:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:49 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:50 compute-2 ceph-mon[75771]: pgmap v38: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 20 B/s, 1 objects/s recovering
Jan 23 09:54:50 compute-2 ceph-mon[75771]: Reconfiguring osd.0 (monmap changed)...
Jan 23 09:54:50 compute-2 ceph-mon[75771]: Reconfiguring daemon osd.0 on compute-1
Jan 23 09:54:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 09:54:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 23 09:54:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:54:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:50 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c009ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:51 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:51.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:51 compute-2 ceph-mon[75771]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 23 09:54:51 compute-2 ceph-mon[75771]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 23 09:54:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:51.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:51 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:51 compute-2 sudo[90294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:51 compute-2 sudo[90294]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:51 compute-2 sudo[90294]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:51 compute-2 sudo[90319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:51 compute-2 sudo[90319]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:52 compute-2 podman[90363]: 2026-01-23 09:54:52.270893417 +0000 UTC m=+0.047709238 container create 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:54:52 compute-2 systemd[1]: Started libpod-conmon-1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b.scope.
Jan 23 09:54:52 compute-2 podman[90363]: 2026-01-23 09:54:52.250179565 +0000 UTC m=+0.026995406 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:54:52 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:54:52 compute-2 podman[90363]: 2026-01-23 09:54:52.37540016 +0000 UTC m=+0.152216001 container init 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Jan 23 09:54:52 compute-2 podman[90363]: 2026-01-23 09:54:52.38506498 +0000 UTC m=+0.161880801 container start 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:54:52 compute-2 fervent_keller[90380]: 167 167
Jan 23 09:54:52 compute-2 ceph-mon[75771]: Reconfiguring node-exporter.compute-1 (unknown last config time)...
Jan 23 09:54:52 compute-2 ceph-mon[75771]: Reconfiguring daemon node-exporter.compute-1 on compute-1
Jan 23 09:54:52 compute-2 ceph-mon[75771]: pgmap v39: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 18 B/s, 1 objects/s recovering
Jan 23 09:54:52 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:52 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:52 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 09:54:52 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 23 09:54:52 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:52 compute-2 podman[90363]: 2026-01-23 09:54:52.393435191 +0000 UTC m=+0.170251012 container attach 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 23 09:54:52 compute-2 systemd[1]: libpod-1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b.scope: Deactivated successfully.
Jan 23 09:54:52 compute-2 podman[90363]: 2026-01-23 09:54:52.39736649 +0000 UTC m=+0.174182311 container died 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Jan 23 09:54:52 compute-2 systemd[1]: var-lib-containers-storage-overlay-0e5cc20f44840a878b108cbc1f67133a8c09989ae827e057a2ebfa0b36f14ff4-merged.mount: Deactivated successfully.
Jan 23 09:54:52 compute-2 podman[90363]: 2026-01-23 09:54:52.449767915 +0000 UTC m=+0.226583736 container remove 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 23 09:54:52 compute-2 systemd[1]: libpod-conmon-1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b.scope: Deactivated successfully.
Jan 23 09:54:52 compute-2 sudo[90319]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:52 compute-2 sudo[90396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:52 compute-2 sudo[90396]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:52 compute-2 sudo[90396]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:52 compute-2 sudo[90421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:52 compute-2 sudo[90421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:52 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:53 compute-2 podman[90461]: 2026-01-23 09:54:53.033692327 +0000 UTC m=+0.047536625 container create dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:54:53 compute-2 systemd[1]: Started libpod-conmon-dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec.scope.
Jan 23 09:54:53 compute-2 systemd[1]: Started libcrun container.
Jan 23 09:54:53 compute-2 podman[90461]: 2026-01-23 09:54:53.012357951 +0000 UTC m=+0.026202279 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:54:53 compute-2 podman[90461]: 2026-01-23 09:54:53.118322476 +0000 UTC m=+0.132166794 container init dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 23 09:54:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:53 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c00a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:53 compute-2 podman[90461]: 2026-01-23 09:54:53.143342427 +0000 UTC m=+0.157186725 container start dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 23 09:54:53 compute-2 focused_bohr[90477]: 167 167
Jan 23 09:54:53 compute-2 systemd[1]: libpod-dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec.scope: Deactivated successfully.
Jan 23 09:54:53 compute-2 podman[90461]: 2026-01-23 09:54:53.151725268 +0000 UTC m=+0.165569576 container attach dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Jan 23 09:54:53 compute-2 podman[90461]: 2026-01-23 09:54:53.152579608 +0000 UTC m=+0.166423906 container died dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:54:53 compute-2 systemd[1]: var-lib-containers-storage-overlay-99de69510b031bdf857af35900514889eda3e4df043bde2fb09067e31662f7d8-merged.mount: Deactivated successfully.
Jan 23 09:54:53 compute-2 podman[90461]: 2026-01-23 09:54:53.200153012 +0000 UTC m=+0.213997310 container remove dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 09:54:53 compute-2 systemd[1]: libpod-conmon-dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec.scope: Deactivated successfully.
Jan 23 09:54:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:53.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:53 compute-2 sudo[90421]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:53 compute-2 ceph-mon[75771]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 23 09:54:53 compute-2 ceph-mon[75771]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 23 09:54:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:53.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:53 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:54 compute-2 ceph-mon[75771]: Reconfiguring crash.compute-2 (unknown last config time)...
Jan 23 09:54:54 compute-2 ceph-mon[75771]: Reconfiguring daemon crash.compute-2 on compute-2
Jan 23 09:54:54 compute-2 ceph-mon[75771]: pgmap v40: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 419 B/s rd, 0 op/s; 15 B/s, 1 objects/s recovering
Jan 23 09:54:54 compute-2 ceph-mon[75771]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Jan 23 09:54:54 compute-2 ceph-mon[75771]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Jan 23 09:54:54 compute-2 ceph-mon[75771]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 23 09:54:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:54 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:55 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 09:54:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:55.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 09:54:55 compute-2 ceph-mon[75771]: pgmap v41: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 350 B/s rd, 0 op/s
Jan 23 09:54:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:55.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:55 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c00a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:56 compute-2 sudo[90519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:54:56 compute-2 sudo[90519]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:56 compute-2 sudo[90519]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:56 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:57 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:57.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:57.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:57 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:57 compute-2 ceph-mon[75771]: pgmap v42: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Jan 23 09:54:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:54:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:54:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:54:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:58 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c00a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:58 compute-2 ceph-mon[75771]: pgmap v43: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 286 B/s rd, 0 op/s
Jan 23 09:54:58 compute-2 ceph-mon[75771]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Jan 23 09:54:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:59 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:59.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:54:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:54:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:59.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:54:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:59 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:54:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:00 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:01 compute-2 ceph-mon[75771]: pgmap v44: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 279 B/s rd, 0 op/s
Jan 23 09:55:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:01 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c00a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:01.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:01.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:01 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:02 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:03 compute-2 sudo[90557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:55:03 compute-2 sudo[90557]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:55:03 compute-2 sudo[90557]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:03 compute-2 ceph-mon[75771]: pgmap v45: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 559 B/s rd, 0 op/s
Jan 23 09:55:03 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:55:03 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:55:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:03 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 09:55:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:03.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 09:55:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:03.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:03 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:04 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:05 compute-2 ceph-mon[75771]: pgmap v46: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 279 B/s rd, 0 op/s
Jan 23 09:55:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:55:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:55:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:05 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:05.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 09:55:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:05.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 09:55:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:05 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:06 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034001d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:07 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:07 compute-2 ceph-mon[75771]: pgmap v47: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 279 B/s rd, 0 op/s
Jan 23 09:55:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:07.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:07.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:07 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:08 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:09 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034001d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:09 compute-2 ceph-mon[75771]: pgmap v48: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 279 B/s rd, 0 op/s
Jan 23 09:55:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:09.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:09.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:09 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:10 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:11 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:11.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:11.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:11 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034002a40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:11 compute-2 ceph-mon[75771]: pgmap v49: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:12 compute-2 ceph-mon[75771]: pgmap v50: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:55:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:12 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:13 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:13.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:13.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:13 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:14 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034002bc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:14 compute-2 ceph-mon[75771]: pgmap v51: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:15 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:15.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:15.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:15 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:16 compute-2 sudo[90596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:55:16 compute-2 sudo[90596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:55:16 compute-2 sudo[90596]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:17 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0340034e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:17.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:17 compute-2 ceph-mon[75771]: pgmap v52: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:17.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:17 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:18 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:19 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:19.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:19 compute-2 ceph-mon[75771]: pgmap v53: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:19.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:19 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0340034e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:55:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:20 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:21 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:21.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:21 compute-2 ceph-mon[75771]: pgmap v54: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:21.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:21 compute-2 sudo[89617]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:21 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:22 compute-2 ceph-mon[75771]: pgmap v55: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 510 B/s rd, 0 op/s
Jan 23 09:55:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:22 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0340034e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:23 compute-2 sudo[90776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfsnbbsheoexjglkvhhqfcxqyadjhfoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162122.786475-367-96269215647801/AnsiballZ_command.py'
Jan 23 09:55:23 compute-2 sudo[90776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:23 compute-2 python3.9[90778]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:55:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:23.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:23.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:24 compute-2 sudo[90776]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:24 compute-2 ceph-mon[75771]: pgmap v56: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:24 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:25 compute-2 sudo[91065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfocvyvxgfukhmpuojgjsvvvlzuruexf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162124.481452-391-58849931922707/AnsiballZ_selinux.py'
Jan 23 09:55:25 compute-2 sudo[91065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:25 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:25.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:25 compute-2 python3.9[91067]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 09:55:25 compute-2 sudo[91065]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:25.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:25 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:26 compute-2 sudo[91219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyukahjondsmzuuwtzkxblaegmggnipr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162125.9498067-423-256064944694108/AnsiballZ_command.py'
Jan 23 09:55:26 compute-2 sudo[91219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:26 compute-2 python3.9[91221]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 09:55:26 compute-2 sudo[91219]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:26 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:26 compute-2 ceph-mon[75771]: pgmap v57: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:27 compute-2 sudo[91371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aixtmlzefwodmpeeupirxiywfsodolqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162126.7802634-448-55741267799790/AnsiballZ_file.py'
Jan 23 09:55:27 compute-2 sudo[91371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:27 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:27 compute-2 python3.9[91373]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:55:27 compute-2 sudo[91371]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:27.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:27.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:27 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:27 compute-2 sudo[91524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftwpzoemoprisnhcdrtydjopwjrjzqoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162127.4789696-472-240619902923865/AnsiballZ_mount.py'
Jan 23 09:55:27 compute-2 sudo[91524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:28 compute-2 python3.9[91526]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 09:55:28 compute-2 sudo[91524]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:28 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:29.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:29.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:29 compute-2 ceph-mon[75771]: pgmap v58: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:30 compute-2 sudo[91678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pptljnmbsxugjtfqodqitrrbaunoxcwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162129.5986655-556-25735491927756/AnsiballZ_file.py'
Jan 23 09:55:30 compute-2 sudo[91678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:30 compute-2 python3.9[91680]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:55:30 compute-2 sudo[91678]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:30 compute-2 ceph-mon[75771]: pgmap v59: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:30 compute-2 sudo[91831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvhcmvmenybzsiducvdzckjrvcmpnglq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162130.5281646-580-82080728660201/AnsiballZ_stat.py'
Jan 23 09:55:30 compute-2 sudo[91831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:30 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:31 compute-2 python3.9[91833]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:55:31 compute-2 sudo[91831]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:31 compute-2 sudo[91909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyorkibkjxncwrwysgpuiqkvkqekdblf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162130.5281646-580-82080728660201/AnsiballZ_file.py'
Jan 23 09:55:31 compute-2 sudo[91909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:31.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:31 compute-2 python3.9[91911]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:55:31 compute-2 sudo[91909]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:55:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:31.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:55:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:32 compute-2 sudo[92063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoervospxluxhcvwxvobwnzylmzstrsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162132.4347422-643-191594339428297/AnsiballZ_stat.py'
Jan 23 09:55:32 compute-2 sudo[92063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:32 compute-2 python3.9[92065]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:55:32 compute-2 sudo[92063]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:32 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:33 compute-2 ceph-mon[75771]: pgmap v60: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 425 B/s rd, 0 op/s
Jan 23 09:55:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:33.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:33.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c0089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095533 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:55:33 compute-2 sudo[92219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emcwkjpxgouavzarcrpicaxrcsoznbfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162133.5384927-682-208870631390019/AnsiballZ_getent.py'
Jan 23 09:55:33 compute-2 sudo[92219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:34 compute-2 python3.9[92221]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 09:55:34 compute-2 sudo[92219]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:34 compute-2 ceph-mon[75771]: pgmap v61: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:55:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:34 compute-2 sudo[92373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpqwpmsxekmupyfakblndjzvnknyossn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162134.5179408-712-185355914950544/AnsiballZ_getent.py'
Jan 23 09:55:34 compute-2 sudo[92373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:34 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:34 compute-2 python3.9[92375]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 09:55:34 compute-2 sudo[92373]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:35.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:35.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:35 compute-2 sudo[92527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwmvbwxvsydtblrvoswkygwcqkqvshtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162135.2276566-736-18606189495662/AnsiballZ_group.py'
Jan 23 09:55:35 compute-2 sudo[92527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:35 compute-2 python3.9[92529]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:55:35 compute-2 sudo[92527]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:55:36 compute-2 sudo[92681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpmnjddbaxlcyeymptkujltcvhulqwoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162136.283593-763-244089102636425/AnsiballZ_file.py'
Jan 23 09:55:36 compute-2 sudo[92681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:36 compute-2 python3.9[92683]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 09:55:36 compute-2 sudo[92681]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:37 compute-2 sudo[92708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:55:37 compute-2 sudo[92708]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:55:37 compute-2 sudo[92708]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:37 compute-2 ceph-mon[75771]: pgmap v62: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:55:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:37.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:37.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:37 compute-2 sudo[92858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcrqmiieuacaxalnlqevevqqfhjifaxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162137.3399656-796-18842342010335/AnsiballZ_dnf.py'
Jan 23 09:55:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:37 compute-2 sudo[92858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:37 compute-2 python3.9[92860]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:55:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:38 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0140016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:39 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:39.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:39 compute-2 ceph-mon[75771]: pgmap v63: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:55:39 compute-2 sudo[92858]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:39.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:39 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:39 compute-2 sudo[93013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mynhrqnrfcyuhqekfrrhkdaybexpmzxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162139.581129-820-120744186584256/AnsiballZ_file.py'
Jan 23 09:55:39 compute-2 sudo[93013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:40 compute-2 python3.9[93015]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:55:40 compute-2 sudo[93013]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:40 compute-2 ceph-mon[75771]: pgmap v64: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:55:40 compute-2 sudo[93167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbqukmuihzeproymprntyristagtadsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162140.3311763-844-210359028117754/AnsiballZ_stat.py'
Jan 23 09:55:40 compute-2 sudo[93167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:40 compute-2 python3.9[93169]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:55:40 compute-2 sudo[93167]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:40 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:41 compute-2 sudo[93245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prnwsoqtgbbowqugqkvazzwsnkdmvwkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162140.3311763-844-210359028117754/AnsiballZ_file.py'
Jan 23 09:55:41 compute-2 sudo[93245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:41 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0140016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:41.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:41 compute-2 python3.9[93247]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:55:41 compute-2 sudo[93245]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:41.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:41 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:41 compute-2 sudo[93397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzuikdopsbjtyaweemgbrswguddscjqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162141.5704834-883-182685155975704/AnsiballZ_stat.py'
Jan 23 09:55:41 compute-2 sudo[93397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:42 compute-2 python3.9[93399]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:55:42 compute-2 sudo[93397]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:42 compute-2 sudo[93476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdurcsypieieodeaizhzphyfssmspiwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162141.5704834-883-182685155975704/AnsiballZ_file.py'
Jan 23 09:55:42 compute-2 sudo[93476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:42 compute-2 python3.9[93479]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:55:42 compute-2 sudo[93476]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:42 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:43 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:43.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:43 compute-2 sudo[93630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwojevlqjiilgzqlahcjgquerzhwuner ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162143.0573857-928-212274154621987/AnsiballZ_dnf.py'
Jan 23 09:55:43 compute-2 sudo[93630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:43 compute-2 ceph-mon[75771]: pgmap v65: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:55:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:43.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:43 compute-2 python3.9[93632]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:55:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:43 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:44 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:55:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:44 compute-2 ceph-mon[75771]: pgmap v66: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:55:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:44 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0140016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.942544) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162144942800, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2005, "num_deletes": 251, "total_data_size": 8191375, "memory_usage": 8441032, "flush_reason": "Manual Compaction"}
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162144984112, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 5055840, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8516, "largest_seqno": 10515, "table_properties": {"data_size": 5047090, "index_size": 5308, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19703, "raw_average_key_size": 20, "raw_value_size": 5028702, "raw_average_value_size": 5332, "num_data_blocks": 236, "num_entries": 943, "num_filter_entries": 943, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162058, "oldest_key_time": 1769162058, "file_creation_time": 1769162144, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 41566 microseconds, and 19614 cpu microseconds.
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.984189) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 5055840 bytes OK
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.984219) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.986184) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.986211) EVENT_LOG_v1 {"time_micros": 1769162144986207, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.986233) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8181745, prev total WAL file size 8181745, number of live WAL files 2.
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.987840) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4937KB)], [18(11MB)]
Jan 23 09:55:44 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162144988123, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17496594, "oldest_snapshot_seqno": -1}
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4120 keys, 13781623 bytes, temperature: kUnknown
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145106661, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 13781623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13748045, "index_size": 22204, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10309, "raw_key_size": 104991, "raw_average_key_size": 25, "raw_value_size": 13666591, "raw_average_value_size": 3317, "num_data_blocks": 954, "num_entries": 4120, "num_filter_entries": 4120, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162144, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.107304) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 13781623 bytes
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.108501) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 147.1 rd, 115.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 11.9 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(6.2) write-amplify(2.7) OK, records in: 4654, records dropped: 534 output_compression: NoCompression
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.108532) EVENT_LOG_v1 {"time_micros": 1769162145108518, "job": 8, "event": "compaction_finished", "compaction_time_micros": 118949, "compaction_time_cpu_micros": 44602, "output_level": 6, "num_output_files": 1, "total_output_size": 13781623, "num_input_records": 4654, "num_output_records": 4120, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145109531, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145111323, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.987659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111492) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:45 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:45 compute-2 sudo[93630]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:45.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:45 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:46 compute-2 python3.9[93786]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:55:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:46 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:46 compute-2 ceph-mon[75771]: pgmap v67: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:55:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:47 compute-2 python3.9[93939]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 09:55:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:47.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:47.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0080016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:55:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:55:48 compute-2 python3.9[94090]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:55:48 compute-2 sshd-session[71309]: Received disconnect from 38.129.56.17 port 45444:11: disconnected by user
Jan 23 09:55:48 compute-2 sshd-session[71309]: Disconnected from user zuul 38.129.56.17 port 45444
Jan 23 09:55:48 compute-2 sshd-session[71306]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:55:48 compute-2 systemd[1]: session-19.scope: Deactivated successfully.
Jan 23 09:55:48 compute-2 systemd[1]: session-19.scope: Consumed 9.049s CPU time.
Jan 23 09:55:48 compute-2 systemd-logind[786]: Session 19 logged out. Waiting for processes to exit.
Jan 23 09:55:48 compute-2 systemd-logind[786]: Removed session 19.
Jan 23 09:55:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:48 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:48 compute-2 ceph-mon[75771]: pgmap v68: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:55:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:49 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:49.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:49 compute-2 sudo[94241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nozivemuvfxmdpybxofnaqfeyoywlbat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162148.8034859-1051-13473721938663/AnsiballZ_systemd.py'
Jan 23 09:55:49 compute-2 sudo[94241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:49.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:49 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:49 compute-2 python3.9[94243]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:55:49 compute-2 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 09:55:49 compute-2 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 09:55:49 compute-2 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 09:55:49 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 09:55:50 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 09:55:50 compute-2 sudo[94241]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:50 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0080016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:50 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:55:50 compute-2 ceph-mon[75771]: pgmap v69: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:55:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:55:51 compute-2 python3.9[94408]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 09:55:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:51 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:55:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:51.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:55:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:55:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:51.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:55:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:51 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:52 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:53 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:53 compute-2 ceph-mon[75771]: pgmap v70: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Jan 23 09:55:53 compute-2 ceph-mon[75771]: mgrmap e32: compute-0.nbdygh(active, since 92s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:55:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:55:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:55:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:53.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:53 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:54 compute-2 ceph-mon[75771]: pgmap v71: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 1.2 KiB/s wr, 2 op/s
Jan 23 09:55:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:54 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:55 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:55 compute-2 sudo[94562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnfwymqrjxyroplkpqiuxpdakpytuhwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162155.2167497-1222-31491425027229/AnsiballZ_systemd.py'
Jan 23 09:55:55 compute-2 sudo[94562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:55.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:55 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:55 compute-2 python3.9[94564]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:55:55 compute-2 sudo[94562]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095555 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:55:56 compute-2 sudo[94718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btsxyrxcsncyxautqprtbenpjjdpjeyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162156.0199566-1222-128511367650381/AnsiballZ_systemd.py'
Jan 23 09:55:56 compute-2 sudo[94718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:56 compute-2 python3.9[94720]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:55:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:56 compute-2 sudo[94718]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:56 compute-2 ceph-mon[75771]: pgmap v72: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 1.2 KiB/s wr, 2 op/s
Jan 23 09:55:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:56 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:57 compute-2 sudo[94747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:55:57 compute-2 sudo[94747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:55:57 compute-2 sudo[94747]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:57 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:55:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:57.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:55:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:55:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:57.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:55:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:57 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:58 compute-2 sshd-session[85230]: Connection closed by 192.168.122.30 port 59756
Jan 23 09:55:58 compute-2 sshd-session[85227]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:55:58 compute-2 systemd[1]: session-37.scope: Deactivated successfully.
Jan 23 09:55:58 compute-2 systemd[1]: session-37.scope: Consumed 1min 15.079s CPU time.
Jan 23 09:55:58 compute-2 systemd-logind[786]: Session 37 logged out. Waiting for processes to exit.
Jan 23 09:55:58 compute-2 systemd-logind[786]: Removed session 37.
Jan 23 09:55:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:58 compute-2 ceph-mon[75771]: pgmap v73: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Jan 23 09:55:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:58 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:59 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:59.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:55:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:59.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:59 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:55:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:00 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:00 compute-2 ceph-mon[75771]: pgmap v74: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 09:56:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:01 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:01.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:01.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:01 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:02 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:03 compute-2 sudo[94778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:56:03 compute-2 sudo[94778]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:03 compute-2 sudo[94778]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:03 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:03 compute-2 sudo[94803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:56:03 compute-2 sudo[94803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:03.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:03 compute-2 ceph-mon[75771]: pgmap v75: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 09:56:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:03.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:03 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:03 compute-2 sudo[94803]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:04 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:05 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:05.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:05 compute-2 ceph-mon[75771]: pgmap v76: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:56:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:56:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:05.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:05 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:06 compute-2 sshd-session[94861]: Accepted publickey for zuul from 192.168.122.30 port 33340 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:56:06 compute-2 systemd-logind[786]: New session 39 of user zuul.
Jan 23 09:56:06 compute-2 systemd[1]: Started Session 39 of User zuul.
Jan 23 09:56:06 compute-2 sshd-session[94861]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:56:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:06 compute-2 ceph-mon[75771]: pgmap v77: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:56:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:06 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:07 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:07 compute-2 python3.9[95015]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:07.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:07.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:07 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:08 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:09 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:09.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:09 compute-2 sudo[95171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtmvjhptqqbjapxtywsdqdurufnfiesh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162169.0135682-66-49809235277350/AnsiballZ_getent.py'
Jan 23 09:56:09 compute-2 sudo[95171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:09.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:09 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:09 compute-2 python3.9[95173]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 09:56:09 compute-2 sudo[95171]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:09 compute-2 ceph-mon[75771]: pgmap v78: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:56:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:10 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:11 compute-2 sudo[95326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlvhmkbruebdbuhegrwvcbbkokutahxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162170.7414374-101-12440189610515/AnsiballZ_setup.py'
Jan 23 09:56:11 compute-2 sudo[95326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:11 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:11.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:11 compute-2 python3.9[95328]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:56:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:11.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:11 compute-2 sudo[95326]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:11 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:11 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:11 compute-2 ceph-mon[75771]: pgmap v79: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:11 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:12 compute-2 sudo[95411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guirrmjuiqvywakolvtqmpepgnkigtcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162170.7414374-101-12440189610515/AnsiballZ_dnf.py'
Jan 23 09:56:12 compute-2 sudo[95411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:12 compute-2 python3.9[95413]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 09:56:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:56:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:56:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:12 compute-2 ceph-mon[75771]: pgmap v80: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:56:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:56:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:56:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:56:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:12 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:13 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:13.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:13.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:13 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:13 compute-2 sudo[95411]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:14 compute-2 sudo[95567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsqzpaedltpefungkoqhqfmazlyazlcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162174.4470494-143-95293559768308/AnsiballZ_dnf.py'
Jan 23 09:56:14 compute-2 sudo[95567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:14 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:14 compute-2 python3.9[95569]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:15 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:15.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:15 compute-2 ceph-mon[75771]: pgmap v81: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:15.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:15 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:16 compute-2 ceph-mon[75771]: pgmap v82: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:16 compute-2 sudo[95567]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:17 compute-2 sudo[95651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:56:17 compute-2 sudo[95651]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:17 compute-2 sudo[95651]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:17 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:17.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:17 compute-2 sudo[95749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxuyhoowydvsmfqdbhyyraobwqsunzre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162176.7430587-167-155876800063140/AnsiballZ_systemd.py'
Jan 23 09:56:17 compute-2 sudo[95749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:17.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:17 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0240008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:17 compute-2 python3.9[95751]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:56:17 compute-2 sudo[95749]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:18 compute-2 python3.9[95906]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:18 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:19 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:19.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:19 compute-2 sudo[96056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyhdmhpfellcfvpelfzohlqyffmeoiaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162179.079614-221-155030084699424/AnsiballZ_sefcontext.py'
Jan 23 09:56:19 compute-2 sudo[96056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:19 compute-2 ceph-mon[75771]: pgmap v83: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 09:56:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:19.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:19 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:19 compute-2 python3.9[96058]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 09:56:20 compute-2 sudo[96056]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:20 compute-2 python3.9[96210]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:20 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0240008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:21 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:21.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:21.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:21 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:21 compute-2 sudo[96366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzkacgctappbjyceeezlihtlpqwqzrki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162181.4580665-275-25961409684503/AnsiballZ_dnf.py'
Jan 23 09:56:21 compute-2 sudo[96366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:21 compute-2 ceph-mon[75771]: pgmap v84: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:56:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095621 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:56:21 compute-2 python3.9[96368]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:22 compute-2 sudo[96372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:56:22 compute-2 sudo[96372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:22 compute-2 sudo[96372]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:22 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0240008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:23 compute-2 ceph-mon[75771]: pgmap v85: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:56:23 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:23 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:23.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:23 compute-2 sudo[96366]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:23.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:24 compute-2 sudo[96548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnscnnhhsvoikohlhfqeybhwwobumpsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162183.921974-299-188753836911075/AnsiballZ_command.py'
Jan 23 09:56:24 compute-2 sudo[96548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:24 compute-2 python3.9[96550]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:56:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:24 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:25 compute-2 sudo[96548]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:25 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:25.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:25 compute-2 ceph-mon[75771]: pgmap v86: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:25.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:25 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002a50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:25 compute-2 sudo[96836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tklcibmdmjwjhobenmitopnaumosvtht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162185.5225806-323-279308800105146/AnsiballZ_file.py'
Jan 23 09:56:25 compute-2 sudo[96836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:26 compute-2 python3.9[96838]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 09:56:26 compute-2 sudo[96836]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:26 compute-2 python3.9[96989]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:56:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:26 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:27 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:27.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:27 compute-2 ceph-mon[75771]: pgmap v87: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:27 compute-2 sudo[97141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydlckkkzjjkurbanwhchxstkbfbzpfwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162187.281866-371-200902010096236/AnsiballZ_dnf.py'
Jan 23 09:56:27 compute-2 sudo[97141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:27.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:27 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:27 compute-2 python3.9[97143]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:28 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002a50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:29 compute-2 ceph-mon[75771]: pgmap v88: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:29.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:29.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:30 compute-2 sudo[97141]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:30 compute-2 sudo[97298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptjgwidgbttewxyqbzixpejknesgkqmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162190.4732006-398-2999097896826/AnsiballZ_dnf.py'
Jan 23 09:56:30 compute-2 sudo[97298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:30 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:31 compute-2 python3.9[97300]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:31 compute-2 ceph-mon[75771]: pgmap v89: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:56:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:31.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:31.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:32 compute-2 sudo[97298]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:32 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:33 compute-2 ceph-mon[75771]: pgmap v90: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:33.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:56:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000051s ======
Jan 23 09:56:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:33.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Jan 23 09:56:33 compute-2 sudo[97453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imepkidgdwkpbnagazjlyravsjyrrtlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162193.388639-435-199589624656395/AnsiballZ_stat.py'
Jan 23 09:56:33 compute-2 sudo[97453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:33 compute-2 python3.9[97455]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:56:33 compute-2 sudo[97453]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:34 compute-2 sudo[97609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auwmhbsupvtsxgdxkyqwhljagtbtmjol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162194.1588993-459-275125230324639/AnsiballZ_slurp.py'
Jan 23 09:56:34 compute-2 sudo[97609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:34 compute-2 python3.9[97611]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 23 09:56:34 compute-2 sudo[97609]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:34 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:35 compute-2 ceph-mon[75771]: pgmap v91: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:56:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:35.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:35.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:35 compute-2 sshd-session[94864]: Connection closed by 192.168.122.30 port 33340
Jan 23 09:56:35 compute-2 sshd-session[94861]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:56:35 compute-2 systemd[1]: session-39.scope: Deactivated successfully.
Jan 23 09:56:35 compute-2 systemd[1]: session-39.scope: Consumed 19.026s CPU time.
Jan 23 09:56:35 compute-2 systemd-logind[786]: Session 39 logged out. Waiting for processes to exit.
Jan 23 09:56:35 compute-2 systemd-logind[786]: Removed session 39.
Jan 23 09:56:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:56:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:56:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003760 fd 48 proxy ignored for local
Jan 23 09:56:37 compute-2 kernel: ganesha.nfsd[95571]: segfault at 50 ip 00007ff0bf73232e sp 00007ff052ffc210 error 4 in libntirpc.so.5.8[7ff0bf717000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 09:56:37 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 09:56:37 compute-2 ceph-mon[75771]: pgmap v92: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:37 compute-2 systemd[1]: Started Process Core Dump (PID 97645/UID 0).
Jan 23 09:56:37 compute-2 sudo[97638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:56:37 compute-2 sudo[97638]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:37 compute-2 sudo[97638]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:37.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:37.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:38 compute-2 systemd-coredump[97661]: Process 85734 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 62:
                                                   #0  0x00007ff0bf73232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Jan 23 09:56:38 compute-2 systemd[1]: systemd-coredump@1-97645-0.service: Deactivated successfully.
Jan 23 09:56:38 compute-2 systemd[1]: systemd-coredump@1-97645-0.service: Consumed 1.656s CPU time.
Jan 23 09:56:39 compute-2 podman[97671]: 2026-01-23 09:56:39.027166118 +0000 UTC m=+0.031797858 container died 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Jan 23 09:56:39 compute-2 systemd[1]: var-lib-containers-storage-overlay-a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d-merged.mount: Deactivated successfully.
Jan 23 09:56:39 compute-2 systemd[77796]: Created slice User Background Tasks Slice.
Jan 23 09:56:39 compute-2 systemd[77796]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 09:56:39 compute-2 podman[97671]: 2026-01-23 09:56:39.069839081 +0000 UTC m=+0.074470801 container remove 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:56:39 compute-2 systemd[77796]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 09:56:39 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 09:56:39 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 09:56:39 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.209s CPU time.
Jan 23 09:56:39 compute-2 ceph-mon[75771]: pgmap v93: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:56:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:39.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:39.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:41 compute-2 ceph-mon[75771]: pgmap v94: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:56:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:41.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:41 compute-2 sshd-session[97719]: Accepted publickey for zuul from 192.168.122.30 port 47586 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:56:41 compute-2 systemd-logind[786]: New session 40 of user zuul.
Jan 23 09:56:41 compute-2 systemd[1]: Started Session 40 of User zuul.
Jan 23 09:56:41 compute-2 sshd-session[97719]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:56:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:41.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:42 compute-2 python3.9[97873]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095643 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:56:43 compute-2 ceph-mon[75771]: pgmap v95: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:56:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:43.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:43 compute-2 python3.9[98028]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:56:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:43.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:44 compute-2 ceph-mon[75771]: pgmap v96: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 09:56:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:44 compute-2 python3.9[98223]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:56:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:45.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:45 compute-2 sshd-session[97722]: Connection closed by 192.168.122.30 port 47586
Jan 23 09:56:45 compute-2 sshd-session[97719]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:56:45 compute-2 systemd[1]: session-40.scope: Deactivated successfully.
Jan 23 09:56:45 compute-2 systemd[1]: session-40.scope: Consumed 2.271s CPU time.
Jan 23 09:56:45 compute-2 systemd-logind[786]: Session 40 logged out. Waiting for processes to exit.
Jan 23 09:56:45 compute-2 systemd-logind[786]: Removed session 40.
Jan 23 09:56:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:45.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095645 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:56:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:46 compute-2 ceph-mon[75771]: pgmap v97: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 09:56:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:47.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:47.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:49 compute-2 ceph-mon[75771]: pgmap v98: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 09:56:49 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 2.
Jan 23 09:56:49 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:56:49 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.209s CPU time.
Jan 23 09:56:49 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:56:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:49.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:49 compute-2 podman[98300]: 2026-01-23 09:56:49.482550706 +0000 UTC m=+0.023799297 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:56:49 compute-2 podman[98300]: 2026-01-23 09:56:49.640893384 +0000 UTC m=+0.182141975 container create d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Jan 23 09:56:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:49.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:49 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 09:56:49 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:56:49 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:56:49 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:56:50 compute-2 podman[98300]: 2026-01-23 09:56:50.285540153 +0000 UTC m=+0.826788744 container init d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 23 09:56:50 compute-2 podman[98300]: 2026-01-23 09:56:50.290536854 +0000 UTC m=+0.831785425 container start d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 09:56:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:56:50 compute-2 bash[98300]: d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1
Jan 23 09:56:50 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 09:56:50 compute-2 sshd-session[98329]: Accepted publickey for zuul from 192.168.122.30 port 36094 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:56:50 compute-2 systemd-logind[786]: New session 41 of user zuul.
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095650 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:56:50 compute-2 systemd[1]: Started Session 41 of User zuul.
Jan 23 09:56:50 compute-2 sshd-session[98329]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:51 compute-2 ceph-mon[75771]: pgmap v99: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 09:56:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:51.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:51 compute-2 python3.9[98512]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:51.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:52 compute-2 python3.9[98668]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:53 compute-2 ceph-mon[75771]: pgmap v100: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 511 B/s wr, 2 op/s
Jan 23 09:56:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:53.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:53 compute-2 sudo[98822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfbamtpukrwprckismlqihuxwesgwixf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162213.1270432-77-253514809655158/AnsiballZ_setup.py'
Jan 23 09:56:53 compute-2 sudo[98822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:53.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:53 compute-2 python3.9[98824]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:56:53 compute-2 sudo[98822]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:54 compute-2 sudo[98908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmfkrmkovdvkviiojhfzoyedgesbvwrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162213.1270432-77-253514809655158/AnsiballZ_dnf.py'
Jan 23 09:56:54 compute-2 sudo[98908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:54 compute-2 ceph-mon[75771]: pgmap v101: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:54 compute-2 python3.9[98910]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:55.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:55.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:56 compute-2 sudo[98908]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:56 compute-2 sudo[99063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeazqfxpaugfzsgmodjtraqceyiwydol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162216.4007854-113-170553932257456/AnsiballZ_setup.py'
Jan 23 09:56:56 compute-2 sudo[99063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:56 compute-2 python3.9[99065]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:56:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:56 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:56:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:56 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:56:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:56 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 09:56:57 compute-2 sudo[99063]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:57 compute-2 sudo[99121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:56:57 compute-2 sudo[99121]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:57 compute-2 sudo[99121]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:56:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:56:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:56:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:57.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:57 compute-2 ceph-mon[75771]: pgmap v102: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:57.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:58 compute-2 sudo[99285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvhkjbfogtzajzojgnwopwjptnwxcjaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162217.8758588-146-72069066728584/AnsiballZ_file.py'
Jan 23 09:56:58 compute-2 sudo[99285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:58 compute-2 python3.9[99287]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:56:58 compute-2 sudo[99285]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:58 compute-2 ceph-mon[75771]: pgmap v103: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Jan 23 09:56:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:56:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:59.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:56:59 compute-2 sudo[99437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhsqbrmdlgxtaitypglcvvzekofbxoeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162219.228201-171-15359597691639/AnsiballZ_command.py'
Jan 23 09:56:59 compute-2 sudo[99437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:56:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:56:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:56:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:59.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:56:59 compute-2 python3.9[99439]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:56:59 compute-2 sudo[99437]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:00 compute-2 sudo[99605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktpytweioprahclcatcgvjfzihwsqdqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162220.1996427-194-231477437919247/AnsiballZ_stat.py'
Jan 23 09:57:00 compute-2 sudo[99605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:00 compute-2 python3.9[99607]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:00 compute-2 sudo[99605]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:01 compute-2 sudo[99683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlxjitqdwfvvhrahhxtbrislhsngketg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162220.1996427-194-231477437919247/AnsiballZ_file.py'
Jan 23 09:57:01 compute-2 sudo[99683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:01 compute-2 python3.9[99685]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:01 compute-2 sudo[99683]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:57:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:01.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:57:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:01.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:01 compute-2 ceph-mon[75771]: pgmap v104: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Jan 23 09:57:02 compute-2 sudo[99836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiunhqgwwmjbrmjxasheufifwxyveptf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162221.8013093-230-174273868827291/AnsiballZ_stat.py'
Jan 23 09:57:02 compute-2 sudo[99836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:02 compute-2 python3.9[99838]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:02 compute-2 sudo[99836]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:02 compute-2 sudo[99915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdxtppjzfksuravmrlofwqeifldmmhbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162221.8013093-230-174273868827291/AnsiballZ_file.py'
Jan 23 09:57:02 compute-2 sudo[99915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:02 compute-2 python3.9[99917]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:02 compute-2 sudo[99915]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:02 compute-2 ceph-mon[75771]: pgmap v105: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 767 B/s wr, 3 op/s
Jan 23 09:57:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:03.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:03 compute-2 sudo[100067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yitigovvtrmzyiamjycdhwpgpqbzxpqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162222.9943566-269-233971443973792/AnsiballZ_ini_file.py'
Jan 23 09:57:03 compute-2 sudo[100067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:03 compute-2 python3.9[100069]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:57:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:03.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:57:03 compute-2 sudo[100067]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:04 compute-2 sudo[100220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buexqmahphmpezxrjqomsyzzamajyooc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162223.8543236-269-198286446588064/AnsiballZ_ini_file.py'
Jan 23 09:57:04 compute-2 sudo[100220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:04 compute-2 python3.9[100222]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:04 compute-2 sudo[100220]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:04 compute-2 sudo[100373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ordsmdthudtxrubgjfrmwwxjfxqbwbmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162224.4618473-269-81309652671037/AnsiballZ_ini_file.py'
Jan 23 09:57:04 compute-2 sudo[100373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 09:57:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:57:04 compute-2 python3.9[100375]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:04 compute-2 sudo[100373]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd49c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:05 compute-2 ceph-mon[75771]: pgmap v106: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 09:57:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:57:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:05.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:57:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:05.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:57:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:06 compute-2 sudo[100541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvcarbivxnueynpkxsjfmmonclbjajyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162225.0407107-269-263655825546632/AnsiballZ_ini_file.py'
Jan 23 09:57:06 compute-2 sudo[100541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:06 compute-2 python3.9[100543]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:06 compute-2 sudo[100541]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:06 compute-2 ceph-mon[75771]: pgmap v107: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 09:57:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095707 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:57:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:07 compute-2 sudo[100693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcofzzhjugvwnfzcnlebsnvxnypjmavj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162227.1163363-362-144667316387501/AnsiballZ_dnf.py'
Jan 23 09:57:07 compute-2 sudo[100693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:07.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:07 compute-2 python3.9[100695]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:57:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:57:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:07.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:57:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:57:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:57:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:09 compute-2 ceph-mon[75771]: pgmap v108: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 1.5 KiB/s wr, 5 op/s
Jan 23 09:57:09 compute-2 sudo[100693]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:57:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:09.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:57:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:09.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:10 compute-2 sudo[100849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcmhvcayyznfzfwcqbvgmmzexweywoub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162229.890795-395-206608586663514/AnsiballZ_setup.py'
Jan 23 09:57:10 compute-2 sudo[100849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:10 compute-2 python3.9[100851]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:57:10 compute-2 sudo[100849]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:10 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:57:10 compute-2 sudo[101004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qllftionhhlcrbtivflsvogiobidpfqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162230.7219846-419-206051253813524/AnsiballZ_stat.py'
Jan 23 09:57:10 compute-2 sudo[101004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:11 compute-2 ceph-mon[75771]: pgmap v109: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Jan 23 09:57:11 compute-2 python3.9[101006]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:57:11 compute-2 sudo[101004]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:11.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:11.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:11 compute-2 sudo[101156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztfagjgvgzcncbhnaidjhistopwuqlpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162231.493839-446-116922286385246/AnsiballZ_stat.py'
Jan 23 09:57:11 compute-2 sudo[101156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:11 compute-2 python3.9[101158]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:57:11 compute-2 sudo[101156]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:12 compute-2 sudo[101310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcsuzricqjfxpyboxetegdjucatkwvka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162232.3269103-476-67170801781475/AnsiballZ_command.py'
Jan 23 09:57:12 compute-2 sudo[101310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:12 compute-2 python3.9[101312]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:57:12 compute-2 sudo[101310]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:13 compute-2 ceph-mon[75771]: pgmap v110: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.9 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Jan 23 09:57:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:13.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:13 compute-2 sudo[101463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-entqmkqsmuukasizbtenbrhatpiqucob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162233.1877563-507-230121766500854/AnsiballZ_service_facts.py'
Jan 23 09:57:13 compute-2 sudo[101463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:13.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:13 compute-2 python3.9[101465]: ansible-service_facts Invoked
Jan 23 09:57:13 compute-2 network[101482]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:57:13 compute-2 network[101483]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:57:13 compute-2 network[101484]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:57:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095714 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:57:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:57:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:15.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:57:15 compute-2 ceph-mon[75771]: pgmap v111: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:57:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:15.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:17 compute-2 sudo[101539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:57:17 compute-2 sudo[101539]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:17 compute-2 sudo[101539]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:57:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:17.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:57:17 compute-2 ceph-mon[75771]: pgmap v112: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:57:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:57:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:17.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:57:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:18 compute-2 sudo[101463]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:18 compute-2 ceph-mon[75771]: pgmap v113: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:57:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:19.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:19 compute-2 sudo[101798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqhhcwhskmiyrayrnuvdwpmxnsglifoj ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769162239.336996-552-121863653153142/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769162239.336996-552-121863653153142/args'
Jan 23 09:57:19 compute-2 sudo[101798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:19 compute-2 sudo[101798]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:19.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:20 compute-2 sudo[101967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvinwczvxjyfsawaisorabwtqnadynnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162240.0971506-585-54842424262499/AnsiballZ_dnf.py'
Jan 23 09:57:20 compute-2 sudo[101967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:20 compute-2 python3.9[101969]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:57:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:20 compute-2 ceph-mon[75771]: pgmap v114: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 170 B/s wr, 0 op/s
Jan 23 09:57:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:57:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:21.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:21.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:22 compute-2 sudo[101967]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:22 compute-2 sudo[101997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:57:22 compute-2 sudo[101997]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:22 compute-2 sudo[101997]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:22 compute-2 sudo[102022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:57:22 compute-2 sudo[102022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:23 compute-2 ceph-mon[75771]: pgmap v115: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 170 B/s wr, 1 op/s
Jan 23 09:57:23 compute-2 sudo[102022]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:23.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:23 compute-2 sudo[102203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raefezmlejxksurdyzygrmzvrwyaqmal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162242.8849049-624-65517615627072/AnsiballZ_package_facts.py'
Jan 23 09:57:23 compute-2 sudo[102203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:23.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:23 compute-2 python3.9[102205]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 09:57:24 compute-2 sudo[102203]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:57:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:57:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:57:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:57:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:57:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:57:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:57:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:25 compute-2 sudo[102357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiamdtzvlzghhwwyyycbknucpvujoypc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162244.8495538-654-86505023744295/AnsiballZ_stat.py'
Jan 23 09:57:25 compute-2 sudo[102357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:25 compute-2 python3.9[102359]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:25 compute-2 ceph-mon[75771]: pgmap v116: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:25 compute-2 sudo[102357]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:25.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:25 compute-2 sudo[102435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esnxlinasibkvtnjimvyukrxaoqiqnzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162244.8495538-654-86505023744295/AnsiballZ_file.py'
Jan 23 09:57:25 compute-2 sudo[102435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:57:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:25.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:57:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:25 compute-2 python3.9[102437]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:25 compute-2 sudo[102435]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:26 compute-2 sudo[102589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhrvuozirplipqgrvoojdjvjnxsvvdpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162246.1468709-691-30147273152756/AnsiballZ_stat.py'
Jan 23 09:57:26 compute-2 sudo[102589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:26 compute-2 python3.9[102591]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:26 compute-2 sudo[102589]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:26 compute-2 sudo[102667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnqgosimpqlzvtktqhtvvazdcryocmlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162246.1468709-691-30147273152756/AnsiballZ_file.py'
Jan 23 09:57:26 compute-2 sudo[102667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:27 compute-2 python3.9[102669]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:27 compute-2 sudo[102667]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:27.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:27 compute-2 ceph-mon[75771]: pgmap v117: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:27.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:28 compute-2 ceph-mon[75771]: pgmap v118: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:28 compute-2 sudo[102821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cffapjpqqlxqlfeikrdjpngqgnxgtgzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162248.2218096-746-267828751094149/AnsiballZ_lineinfile.py'
Jan 23 09:57:28 compute-2 sudo[102821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:28 compute-2 python3.9[102823]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:28 compute-2 sudo[102821]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:29 compute-2 sudo[102848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:57:29 compute-2 sudo[102848]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:29 compute-2 sudo[102848]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:29.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:29.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:57:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:57:30 compute-2 sudo[103001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqzjwlwhjmmdhnfiltpmvovomzjbqdqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162249.9864123-790-62034885436601/AnsiballZ_setup.py'
Jan 23 09:57:30 compute-2 sudo[103001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:30 compute-2 python3.9[103003]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:57:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:30 compute-2 sudo[103001]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:31 compute-2 sudo[103085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbxpwxparoetfgrnsbuwwvnjcxwvorur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162249.9864123-790-62034885436601/AnsiballZ_systemd.py'
Jan 23 09:57:31 compute-2 sudo[103085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:31.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:31 compute-2 python3.9[103087]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:57:31 compute-2 sudo[103085]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:31 compute-2 ceph-mon[75771]: pgmap v119: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:31.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:32 compute-2 sshd-session[98362]: Connection closed by 192.168.122.30 port 36094
Jan 23 09:57:32 compute-2 sshd-session[98329]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:57:32 compute-2 systemd[1]: session-41.scope: Deactivated successfully.
Jan 23 09:57:32 compute-2 systemd[1]: session-41.scope: Consumed 24.263s CPU time.
Jan 23 09:57:32 compute-2 systemd-logind[786]: Session 41 logged out. Waiting for processes to exit.
Jan 23 09:57:32 compute-2 systemd-logind[786]: Removed session 41.
Jan 23 09:57:32 compute-2 ceph-mon[75771]: pgmap v120: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:57:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:33.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:33.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:34 compute-2 ceph-mon[75771]: pgmap v121: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:35.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:35.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:57:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:36 compute-2 ceph-mon[75771]: pgmap v122: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:37.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:37 compute-2 sudo[103123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:57:37 compute-2 sudo[103123]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:37 compute-2 sudo[103123]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:37.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:38 compute-2 sshd-session[103149]: Accepted publickey for zuul from 192.168.122.30 port 40712 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:57:38 compute-2 systemd-logind[786]: New session 42 of user zuul.
Jan 23 09:57:38 compute-2 systemd[1]: Started Session 42 of User zuul.
Jan 23 09:57:38 compute-2 sshd-session[103149]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:57:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:38 compute-2 sudo[103303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbbtrinjrkxzwtyektvipppeyodiyzgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162258.2780037-23-211189675134802/AnsiballZ_file.py'
Jan 23 09:57:38 compute-2 sudo[103303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:38 compute-2 python3.9[103305]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:38 compute-2 ceph-mon[75771]: pgmap v123: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:38 compute-2 sudo[103303]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:39.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:39 compute-2 sudo[103455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxilchfxsqzewnwdwiorqbczjkatcipy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162259.2352643-59-259680500176115/AnsiballZ_stat.py'
Jan 23 09:57:39 compute-2 sudo[103455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:39.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:39 compute-2 python3.9[103457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:39 compute-2 sudo[103455]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:40 compute-2 sudo[103534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyvylqlehlgievrfxmzegmwvmzqhrpsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162259.2352643-59-259680500176115/AnsiballZ_file.py'
Jan 23 09:57:40 compute-2 sudo[103534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:40 compute-2 python3.9[103536]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:40 compute-2 sudo[103534]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:40 compute-2 sshd-session[103152]: Connection closed by 192.168.122.30 port 40712
Jan 23 09:57:40 compute-2 sshd-session[103149]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:57:40 compute-2 systemd[1]: session-42.scope: Deactivated successfully.
Jan 23 09:57:40 compute-2 systemd[1]: session-42.scope: Consumed 1.519s CPU time.
Jan 23 09:57:40 compute-2 systemd-logind[786]: Session 42 logged out. Waiting for processes to exit.
Jan 23 09:57:40 compute-2 systemd-logind[786]: Removed session 42.
Jan 23 09:57:41 compute-2 ceph-mon[75771]: pgmap v124: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488001930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:41.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:41.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488001930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:43.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:43.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:45.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:45 compute-2 ceph-mon[75771]: pgmap v125: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:57:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488001930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:45.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:46 compute-2 ceph-mon[75771]: pgmap v126: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:46 compute-2 ceph-mon[75771]: pgmap v127: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:47.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:47 compute-2 sshd-session[103568]: Accepted publickey for zuul from 192.168.122.30 port 53474 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:57:47 compute-2 systemd-logind[786]: New session 43 of user zuul.
Jan 23 09:57:47 compute-2 systemd[1]: Started Session 43 of User zuul.
Jan 23 09:57:47 compute-2 sshd-session[103568]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:57:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:47.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:48 compute-2 python3.9[103722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:57:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488002da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:49 compute-2 ceph-mon[75771]: pgmap v128: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:49.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:49 compute-2 sudo[103877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmxnqrapueknyxjgoihwwesyohiqxhvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162269.162436-56-264535277591162/AnsiballZ_file.py'
Jan 23 09:57:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:49 compute-2 sudo[103877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:49.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:49 compute-2 python3.9[103879]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:49 compute-2 sudo[103877]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:57:50 compute-2 sudo[104054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npkahgvumobxiromzedmgucgzsvvbpll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162270.097-80-53893340095445/AnsiballZ_stat.py'
Jan 23 09:57:50 compute-2 sudo[104054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:50 compute-2 python3.9[104056]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:50 compute-2 sudo[104054]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:50 compute-2 sudo[104132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apyuanjeshenafignoivdgacaxpcvaco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162270.097-80-53893340095445/AnsiballZ_file.py'
Jan 23 09:57:50 compute-2 sudo[104132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:51 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:51 compute-2 python3.9[104134]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.f5_stt0h recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:51 compute-2 sudo[104132]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:51 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:51.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:51 compute-2 ceph-mon[75771]: pgmap v129: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:51 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488002da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:51.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.117093) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272117337, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1429, "num_deletes": 252, "total_data_size": 4156598, "memory_usage": 4203608, "flush_reason": "Manual Compaction"}
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272135863, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1767345, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10521, "largest_seqno": 11944, "table_properties": {"data_size": 1762718, "index_size": 2087, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11771, "raw_average_key_size": 20, "raw_value_size": 1752658, "raw_average_value_size": 2980, "num_data_blocks": 94, "num_entries": 588, "num_filter_entries": 588, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162145, "oldest_key_time": 1769162145, "file_creation_time": 1769162272, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 18796 microseconds, and 7421 cpu microseconds.
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.135963) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1767345 bytes OK
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.136001) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.138574) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.138608) EVENT_LOG_v1 {"time_micros": 1769162272138602, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.138634) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4149955, prev total WAL file size 4149955, number of live WAL files 2.
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.140245) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323533' seq:0, type:0; will stop at (end)
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1725KB)], [21(13MB)]
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272140586, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 15548968, "oldest_snapshot_seqno": -1}
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4244 keys, 13455121 bytes, temperature: kUnknown
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272256073, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 13455121, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13422807, "index_size": 20620, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 107932, "raw_average_key_size": 25, "raw_value_size": 13341254, "raw_average_value_size": 3143, "num_data_blocks": 884, "num_entries": 4244, "num_filter_entries": 4244, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162272, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.256657) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 13455121 bytes
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.258367) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 134.5 rd, 116.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.1 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(16.4) write-amplify(7.6) OK, records in: 4708, records dropped: 464 output_compression: NoCompression
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.258388) EVENT_LOG_v1 {"time_micros": 1769162272258378, "job": 10, "event": "compaction_finished", "compaction_time_micros": 115563, "compaction_time_cpu_micros": 52610, "output_level": 6, "num_output_files": 1, "total_output_size": 13455121, "num_input_records": 4708, "num_output_records": 4244, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272259029, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272261459, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.139799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261489) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261495) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-2 sudo[104286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncepaktrkdlpknxeynxczjojcxxuwtuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162272.0811908-140-222042482454199/AnsiballZ_stat.py'
Jan 23 09:57:52 compute-2 sudo[104286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:52 compute-2 python3.9[104288]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:52 compute-2 sudo[104286]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:52 compute-2 sudo[104364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmfgqreaqsjvznztqzvddlcenlutcckd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162272.0811908-140-222042482454199/AnsiballZ_file.py'
Jan 23 09:57:52 compute-2 sudo[104364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:53 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:53 compute-2 python3.9[104366]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.c2vsg48p recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:53 compute-2 sudo[104364]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:53 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:53 compute-2 ceph-mon[75771]: pgmap v130: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:57:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:53.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:53 compute-2 sudo[104516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuqqgjupckslmzmgynmhniqhiegqvmaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162273.3289793-180-44268593760809/AnsiballZ_file.py'
Jan 23 09:57:53 compute-2 sudo[104516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:53 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:53.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:53 compute-2 python3.9[104518]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:53 compute-2 sudo[104516]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:54 compute-2 sudo[104670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igikvlsqwbsomdytabwsttorxvwmsuoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162274.0617692-204-181176811482087/AnsiballZ_stat.py'
Jan 23 09:57:54 compute-2 sudo[104670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:54 compute-2 python3.9[104672]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:54 compute-2 sudo[104670]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:54 compute-2 sudo[104748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbvclbcfdjmpmdaeafqeyjiyrxnnswar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162274.0617692-204-181176811482087/AnsiballZ_file.py'
Jan 23 09:57:54 compute-2 sudo[104748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:55 compute-2 python3.9[104750]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:55 compute-2 sudo[104748]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:55 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488002da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:55 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:55.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:55 compute-2 sudo[104900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oerajovoszsegswbpbgrplqbqfontyfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162275.2109475-204-37421108402104/AnsiballZ_stat.py'
Jan 23 09:57:55 compute-2 sudo[104900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:55 compute-2 ceph-mon[75771]: pgmap v131: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:55 compute-2 python3.9[104902]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:55 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:55 compute-2 sudo[104900]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:55.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:56 compute-2 sudo[104979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdkhwjqegkzbebihuvmbqbysbobicqqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162275.2109475-204-37421108402104/AnsiballZ_file.py'
Jan 23 09:57:56 compute-2 sudo[104979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:56 compute-2 python3.9[104981]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:56 compute-2 sudo[104979]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:56 compute-2 sudo[105132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xztldzldxtgbmugdxrzrxrsrowsqvsub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162276.4216402-273-136145577222072/AnsiballZ_file.py'
Jan 23 09:57:56 compute-2 sudo[105132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:56 compute-2 python3.9[105134]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:56 compute-2 sudo[105132]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:57 compute-2 ceph-mon[75771]: pgmap v132: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:57 compute-2 sudo[105284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuhmtzysnoowrabgtynziaobowzshjer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162277.0869489-296-2298929252698/AnsiballZ_stat.py'
Jan 23 09:57:57 compute-2 sudo[105284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:57:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:57.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:57:57 compute-2 sudo[105287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:57:57 compute-2 python3.9[105286]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:57 compute-2 sudo[105287]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:57 compute-2 sudo[105287]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:57 compute-2 sudo[105284]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:57.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:57 compute-2 sudo[105387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooirtfnpxyalhezoikelgsecxnozcswi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162277.0869489-296-2298929252698/AnsiballZ_file.py'
Jan 23 09:57:57 compute-2 sudo[105387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:58 compute-2 python3.9[105389]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:58 compute-2 sudo[105387]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:58 compute-2 sudo[105541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqofyisnunlqqwkcsxwolfverltjttib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162278.2421234-332-140867753272459/AnsiballZ_stat.py'
Jan 23 09:57:58 compute-2 sudo[105541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:58 compute-2 python3.9[105543]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:58 compute-2 sudo[105541]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:58 compute-2 sudo[105619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clbnveaphjupkmjzrpuihvkuejfamtvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162278.2421234-332-140867753272459/AnsiballZ_file.py'
Jan 23 09:57:58 compute-2 sudo[105619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:59 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:59 compute-2 python3.9[105621]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:59 compute-2 sudo[105619]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:59 compute-2 ceph-mon[75771]: pgmap v133: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:59 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:57:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:59.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:57:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:57:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:59 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:57:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:59.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:00 compute-2 sudo[105772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olhbwpvllbmcjsmcxqiweeajddbdxaaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162279.4120424-368-78992634002581/AnsiballZ_systemd.py'
Jan 23 09:58:00 compute-2 sudo[105772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:00 compute-2 python3.9[105774]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:58:00 compute-2 systemd[1]: Reloading.
Jan 23 09:58:00 compute-2 systemd-sysv-generator[105805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:58:00 compute-2 systemd-rc-local-generator[105801]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:58:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:00 compute-2 sudo[105772]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:01 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:01 compute-2 sudo[105962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwmbdggptfwqfzhehtkianqgyavojpeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162280.9358788-393-142375204279015/AnsiballZ_stat.py'
Jan 23 09:58:01 compute-2 sudo[105962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:01 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:01 compute-2 ceph-mon[75771]: pgmap v134: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:01 compute-2 python3.9[105964]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:01 compute-2 sudo[105962]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:01.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:01 compute-2 sudo[106040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhnqegjjchyviodxzvkiyzmsamaylawa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162280.9358788-393-142375204279015/AnsiballZ_file.py'
Jan 23 09:58:01 compute-2 sudo[106040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:01 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:58:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:01.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:58:01 compute-2 python3.9[106042]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:01 compute-2 sudo[106040]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:02 compute-2 sudo[106194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivoxivzkuwhkovlnamaaildpzfiiaiah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162282.0707684-428-192956876552097/AnsiballZ_stat.py'
Jan 23 09:58:02 compute-2 sudo[106194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:02 compute-2 python3.9[106196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:02 compute-2 sudo[106194]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:02 compute-2 sudo[106272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdkbzyrtsofuimxdhtrdesjbpjbjvgyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162282.0707684-428-192956876552097/AnsiballZ_file.py'
Jan 23 09:58:02 compute-2 sudo[106272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:02 compute-2 python3.9[106274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:03 compute-2 sudo[106272]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:03 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:03 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:03 compute-2 ceph-mon[75771]: pgmap v135: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:58:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:58:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:03.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:58:03 compute-2 sudo[106424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cavrhbimlzlzxowgxxgxxomedimbnuws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162283.2121637-465-143534649134036/AnsiballZ_systemd.py'
Jan 23 09:58:03 compute-2 sudo[106424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:03 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:03.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:03 compute-2 python3.9[106426]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:58:03 compute-2 systemd[1]: Reloading.
Jan 23 09:58:03 compute-2 systemd-sysv-generator[106458]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:58:03 compute-2 systemd-rc-local-generator[106455]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:58:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:04 compute-2 systemd[1]: Starting Create netns directory...
Jan 23 09:58:04 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 09:58:04 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 09:58:04 compute-2 systemd[1]: Finished Create netns directory.
Jan 23 09:58:04 compute-2 sudo[106424]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:04 compute-2 ceph-mon[75771]: pgmap v136: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:05 compute-2 python3.9[106619]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:58:05 compute-2 network[106636]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:58:05 compute-2 network[106637]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:58:05 compute-2 network[106638]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:58:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:58:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:05.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:58:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:05.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:58:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:07 compute-2 ceph-mon[75771]: pgmap v137: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:07.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:58:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:07.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:58:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095808 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:58:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:09 compute-2 sudo[106905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcifbcwagmvmyqulmsbdmdsapjqbpsau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162288.90686-543-188922492019924/AnsiballZ_stat.py'
Jan 23 09:58:09 compute-2 sudo[106905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:09 compute-2 ceph-mon[75771]: pgmap v138: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:09 compute-2 python3.9[106907]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:09 compute-2 sudo[106905]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:09.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:09 compute-2 sudo[106983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpbgsnzintvyefihoguabjqdxbjkcauw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162288.90686-543-188922492019924/AnsiballZ_file.py'
Jan 23 09:58:09 compute-2 sudo[106983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:58:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:09.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:58:09 compute-2 python3.9[106985]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:09 compute-2 sudo[106983]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:10 compute-2 sudo[107137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wptxlavazvesxpsnzekdbthckbvehjjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162290.2550864-582-209459215943484/AnsiballZ_file.py'
Jan 23 09:58:10 compute-2 sudo[107137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:10 compute-2 python3.9[107139]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:10 compute-2 sudo[107137]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:11 compute-2 sudo[107289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynkqkrfciyckecwjqvfrujaupmzfkqtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162290.974678-606-120511527619966/AnsiballZ_stat.py'
Jan 23 09:58:11 compute-2 sudo[107289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:11 compute-2 ceph-mon[75771]: pgmap v139: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:11 compute-2 python3.9[107291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:11 compute-2 sudo[107289]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:58:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:11.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:58:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:11 compute-2 sudo[107367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umyjopgcpjbksaxktqexwblatqfskovo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162290.974678-606-120511527619966/AnsiballZ_file.py'
Jan 23 09:58:11 compute-2 sudo[107367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:11.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:11 compute-2 python3.9[107369]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:11 compute-2 sudo[107367]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:13 compute-2 sudo[107521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlesjcewykpcgbtotytjxyyrwfckmjxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162292.6384716-651-220117797701407/AnsiballZ_timezone.py'
Jan 23 09:58:13 compute-2 sudo[107521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:13 compute-2 python3.9[107523]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 09:58:13 compute-2 systemd[1]: Starting Time & Date Service...
Jan 23 09:58:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:13 compute-2 systemd[1]: Started Time & Date Service.
Jan 23 09:58:13 compute-2 sudo[107521]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:13.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:13 compute-2 ceph-mon[75771]: pgmap v140: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:58:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:58:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:13.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:58:14 compute-2 sudo[107678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzdmufypauhwoefaxrfujdblrxbdjkxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162293.7620876-678-149292813353578/AnsiballZ_file.py'
Jan 23 09:58:14 compute-2 sudo[107678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:14 compute-2 python3.9[107680]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:14 compute-2 sudo[107678]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:14 compute-2 ceph-mon[75771]: pgmap v141: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:58:14 compute-2 sudo[107831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dccxgmmaairapqihjpnsoccgzkrdjhmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162294.445578-702-88982569056262/AnsiballZ_stat.py'
Jan 23 09:58:14 compute-2 sudo[107831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:14 compute-2 python3.9[107833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:14 compute-2 sudo[107831]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:15 compute-2 sudo[107909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycbwmrcjthxeembbmbtrgjqlvtizpmqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162294.445578-702-88982569056262/AnsiballZ_file.py'
Jan 23 09:58:15 compute-2 sudo[107909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:15 compute-2 python3.9[107911]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:15 compute-2 sudo[107909]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:58:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:15.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:58:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:58:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:15.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:58:15 compute-2 sudo[108061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfmmsplianepewttcnqinlmfvepnituc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162295.602858-738-227783786656842/AnsiballZ_stat.py'
Jan 23 09:58:15 compute-2 sudo[108061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:16 compute-2 python3.9[108063]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:16 compute-2 sudo[108061]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:16 compute-2 sudo[108141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbtagvubdxlwonjemzezavudtfmfjhrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162295.602858-738-227783786656842/AnsiballZ_file.py'
Jan 23 09:58:16 compute-2 sudo[108141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:16 compute-2 python3.9[108143]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6lbejo7y recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:16 compute-2 sudo[108141]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:17 compute-2 sudo[108293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adptfbezsekcllxsjqnginjjnoxjgzpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162296.9541066-774-145645975598725/AnsiballZ_stat.py'
Jan 23 09:58:17 compute-2 sudo[108293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:17 compute-2 python3.9[108295]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:17 compute-2 ceph-mon[75771]: pgmap v142: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:58:17 compute-2 sudo[108293]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:17.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:17 compute-2 sudo[108325]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:58:17 compute-2 sudo[108325]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:17 compute-2 sudo[108325]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:17 compute-2 sudo[108396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axiycrgfcmsjzbrgqjirszsozcdwnsgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162296.9541066-774-145645975598725/AnsiballZ_file.py'
Jan 23 09:58:17 compute-2 sudo[108396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:58:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:17.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:58:17 compute-2 python3.9[108398]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:17 compute-2 sudo[108396]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:18 compute-2 sudo[108550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoygxvusxgymsgrmdyghakcuqitkquhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162298.1732383-813-19153762832245/AnsiballZ_command.py'
Jan 23 09:58:18 compute-2 sudo[108550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:18 compute-2 python3.9[108552]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:58:18 compute-2 sudo[108550]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:58:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:19 compute-2 ceph-mon[75771]: pgmap v143: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:58:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:58:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:19.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:58:19 compute-2 sudo[108703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vykwuyiddzlqmxguyywfifvfbgiwyhgi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162299.1125567-837-240157107050740/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 09:58:19 compute-2 sudo[108703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:19 compute-2 python3[108705]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 09:58:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:19 compute-2 sudo[108703]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:58:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:19.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:58:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:20 compute-2 sudo[108857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skpayvxcwxvpjslqnauuwzbcgjsrwstf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162300.0537996-861-26455204841321/AnsiballZ_stat.py'
Jan 23 09:58:20 compute-2 sudo[108857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:20 compute-2 python3.9[108859]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:20 compute-2 sudo[108857]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:58:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:20 compute-2 sudo[108935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjnbfyxwsrasvtsmspjqatanmzukatga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162300.0537996-861-26455204841321/AnsiballZ_file.py'
Jan 23 09:58:20 compute-2 sudo[108935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:21 compute-2 python3.9[108937]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:21 compute-2 sudo[108935]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:21.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:21.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:21 compute-2 sudo[109088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzlzxjebocgksvmkqrtzpexbizffmrxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162301.3898644-896-8143406466741/AnsiballZ_stat.py'
Jan 23 09:58:21 compute-2 sudo[109088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:22 compute-2 python3.9[109090]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:22 compute-2 sudo[109088]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:22 compute-2 ceph-mon[75771]: pgmap v144: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:58:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:22 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:58:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:22 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:58:22 compute-2 sudo[109214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwbqcasgpljxpmqactqthpdotuxacfeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162301.3898644-896-8143406466741/AnsiballZ_copy.py'
Jan 23 09:58:22 compute-2 sudo[109214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:22 compute-2 python3.9[109216]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162301.3898644-896-8143406466741/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:22 compute-2 sudo[109214]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:58:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:23.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:58:23 compute-2 sudo[109366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efummxpbzbtxqyngmwzpxjcchvliwjzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162303.2936113-942-228056593804092/AnsiballZ_stat.py'
Jan 23 09:58:23 compute-2 sudo[109366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:23 compute-2 ceph-mon[75771]: pgmap v145: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:58:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780020f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:23 compute-2 python3.9[109368]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:23 compute-2 sudo[109366]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:23.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:24 compute-2 sudo[109445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpefkaqbgnvflykrfegpnyqmaadufotk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162303.2936113-942-228056593804092/AnsiballZ_file.py'
Jan 23 09:58:24 compute-2 sudo[109445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:24 compute-2 python3.9[109447]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:24 compute-2 sudo[109445]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:24 compute-2 sudo[109598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnmjbyjzttfbrztggjkkeuvkwzevkrae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162304.4991956-978-215322752487370/AnsiballZ_stat.py'
Jan 23 09:58:24 compute-2 sudo[109598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:25 compute-2 python3.9[109600]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:25 compute-2 sudo[109598]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:25 compute-2 ceph-mon[75771]: pgmap v146: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:58:25 compute-2 sudo[109676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itbfmnpmttcrbajusgehqotpahdybvjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162304.4991956-978-215322752487370/AnsiballZ_file.py'
Jan 23 09:58:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:25 compute-2 sudo[109676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:25 compute-2 python3.9[109678]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:25.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:25 compute-2 sudo[109676]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:58:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:58:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:25.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:58:26 compute-2 sudo[109829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlnbijajokusegrquhwczlkvxfuilzys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162305.723846-1014-182716984926362/AnsiballZ_stat.py'
Jan 23 09:58:26 compute-2 sudo[109829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:26 compute-2 python3.9[109831]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:26 compute-2 sudo[109829]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:26 compute-2 sudo[109908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbpbobvlfkagphztxuxsiprkftibeknz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162305.723846-1014-182716984926362/AnsiballZ_file.py'
Jan 23 09:58:26 compute-2 sudo[109908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:26 compute-2 python3.9[109910]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:26 compute-2 sudo[109908]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:27 compute-2 sudo[110060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnloymdvrhjtwmcpxbtflfcdqkdwcrvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162307.0275218-1053-245526569493962/AnsiballZ_command.py'
Jan 23 09:58:27 compute-2 sudo[110060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:27 compute-2 ceph-mon[75771]: pgmap v147: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:58:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:27 compute-2 python3.9[110062]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:58:27 compute-2 sudo[110060]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:27.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:27.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:28 compute-2 sudo[110216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-armljinkghdeugisigfiwqhufzzyvnnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162307.7282329-1077-47199193425372/AnsiballZ_blockinfile.py'
Jan 23 09:58:28 compute-2 sudo[110216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:28 compute-2 python3.9[110218]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:28 compute-2 sudo[110216]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:28 compute-2 sudo[110369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcexvtqepcvnpkocssjfuxxkiojncqog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162308.7142055-1103-114244254687421/AnsiballZ_file.py'
Jan 23 09:58:28 compute-2 sudo[110369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:29 compute-2 python3.9[110371]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:29 compute-2 sudo[110372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:58:29 compute-2 sudo[110372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:29 compute-2 sudo[110372]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:29 compute-2 sudo[110369]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:29 compute-2 sudo[110397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:58:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:29 compute-2 sudo[110397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:29.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:29 compute-2 sudo[110586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bccyqmmpdbleughdoqxlmxkgvztrimop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162309.371284-1103-60577888511715/AnsiballZ_file.py'
Jan 23 09:58:29 compute-2 sudo[110586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:29 compute-2 ceph-mon[75771]: pgmap v148: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1020 B/s wr, 3 op/s
Jan 23 09:58:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:29 compute-2 sudo[110397]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:29 compute-2 python3.9[110590]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:29.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:29 compute-2 sudo[110586]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:30 compute-2 sudo[110756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqpoawavjdanlopqvnybubqohmhfijsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162310.043694-1149-122226992980583/AnsiballZ_mount.py'
Jan 23 09:58:30 compute-2 sudo[110756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095830 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:58:30 compute-2 python3.9[110758]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 09:58:30 compute-2 sudo[110756]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:31 compute-2 ceph-mon[75771]: pgmap v149: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 935 B/s wr, 2 op/s
Jan 23 09:58:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:58:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:58:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:58:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:58:31 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:58:31 compute-2 sudo[110908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muglzubpspmoaddzuhuyyhbvsxcpbelw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162310.910615-1149-234556082980347/AnsiballZ_mount.py'
Jan 23 09:58:31 compute-2 sudo[110908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:31 compute-2 python3.9[110910]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 09:58:31 compute-2 sudo[110908]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:58:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:31.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:58:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:31 compute-2 sshd-session[103571]: Connection closed by 192.168.122.30 port 53474
Jan 23 09:58:31 compute-2 sshd-session[103568]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:58:31 compute-2 systemd[1]: session-43.scope: Deactivated successfully.
Jan 23 09:58:31 compute-2 systemd[1]: session-43.scope: Consumed 30.036s CPU time.
Jan 23 09:58:31 compute-2 systemd-logind[786]: Session 43 logged out. Waiting for processes to exit.
Jan 23 09:58:31 compute-2 systemd-logind[786]: Removed session 43.
Jan 23 09:58:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:31.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:33 compute-2 ceph-mon[75771]: pgmap v150: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 935 B/s wr, 2 op/s
Jan 23 09:58:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:33.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:33.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:35 compute-2 ceph-mon[75771]: pgmap v151: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 09:58:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:58:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:35.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:35.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:36 compute-2 sudo[110940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:58:36 compute-2 sudo[110940]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:36 compute-2 sudo[110940]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:36 compute-2 ceph-mon[75771]: pgmap v152: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 09:58:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:37.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:37 compute-2 sshd-session[110966]: Accepted publickey for zuul from 192.168.122.30 port 39302 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:58:37 compute-2 systemd-logind[786]: New session 44 of user zuul.
Jan 23 09:58:37 compute-2 systemd[1]: Started Session 44 of User zuul.
Jan 23 09:58:37 compute-2 sshd-session[110966]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:58:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:37 compute-2 sudo[110970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:58:37 compute-2 sudo[110970]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:37 compute-2 sudo[110970]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:37.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:38 compute-2 sudo[111145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itlwwwubbdjuvmokepleofxuhyjigiwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162317.7140548-20-259174826589223/AnsiballZ_tempfile.py'
Jan 23 09:58:38 compute-2 sudo[111145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:38 compute-2 python3.9[111147]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 09:58:38 compute-2 sudo[111145]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:39 compute-2 sudo[111298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cipuxmhvrupbbgvwdbnqysserzcttwxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162318.6295025-56-252269384524197/AnsiballZ_stat.py'
Jan 23 09:58:39 compute-2 sudo[111298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:39 compute-2 python3.9[111300]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:58:39 compute-2 ceph-mon[75771]: pgmap v153: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 09:58:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:39 compute-2 sudo[111298]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:39.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:39.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:39 compute-2 sudo[111453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afhpiwhuuznkhlxdotysxqkutszetrpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162319.5042028-80-2865089779064/AnsiballZ_slurp.py'
Jan 23 09:58:39 compute-2 sudo[111453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:40 compute-2 python3.9[111455]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 23 09:58:40 compute-2 sudo[111453]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:40 compute-2 sudo[111606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwaixtznbrplnufipgeqqnmxkffyiaqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162320.3867793-105-111277945519754/AnsiballZ_stat.py'
Jan 23 09:58:40 compute-2 sudo[111606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:40 compute-2 python3.9[111608]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.cgk9a7px follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:40 compute-2 sudo[111606]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:41 compute-2 ceph-mon[75771]: pgmap v154: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:58:41 compute-2 sudo[111734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgvphibzrdwithkozwqxrukjolrunnzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162320.3867793-105-111277945519754/AnsiballZ_copy.py'
Jan 23 09:58:41 compute-2 sudo[111734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:41.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:41 compute-2 python3.9[111736]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.cgk9a7px mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162320.3867793-105-111277945519754/.source.cgk9a7px _original_basename=.nttzwjo3 follow=False checksum=6c63675b4fda7e0d01c328fcbe34dc890491aeeb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:41 compute-2 sudo[111734]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:58:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:41.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:58:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:42 compute-2 sudo[111888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrbrtlavxexsrlwciffcctsqamaumfmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162321.8899918-149-217770147095032/AnsiballZ_setup.py'
Jan 23 09:58:42 compute-2 sudo[111888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:42 compute-2 ceph-mon[75771]: pgmap v155: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:58:42 compute-2 python3.9[111890]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:58:42 compute-2 sudo[111888]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:43 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 09:58:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:43.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:43 compute-2 sudo[112043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cctbtkqberuanksmyawhdsxpwjdvjusm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162323.1736188-174-24603343726804/AnsiballZ_blockinfile.py'
Jan 23 09:58:43 compute-2 sudo[112043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:43 compute-2 python3.9[112045]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+cj2so8SS29oYZ1K+7e02qi6fVkGXJzGMkIN9mgJPLCBtQ6vpBYEObTZZXuMIHhdiMUAp6RDjs11OXDkAB9R7e2ncjMKn7J2EHbmceT7rNq9L0w+QaLKFxl+xdJQ9QtO9ioNgJFXXQZt/IOeE8S4I5yhEM5jn+YEW0LPbp99Wz1d1Ob4GI1t0hCEv/4ayC3nRIXkuIhl7mrV0s22F8NE8f0hZZKaw1u8xmmpbD8ZVBsC6cxWE3kIQBmHu8q9tylaZjLsjGxBDUF9ko3bxeppvLPDMem89VLQCWbgmOHl5ZIPsyNglusTIBUp8uA7g+Agz1uMojClMHnsZl68WjbCAVcRA9y/UgXphGyEYZCUJMv8CjYKzxriyHALZl6YFSyC5ELlEAxL8fyTwtXhQ1+e/lI9Ak3n4suC6JyH0NQ27MPIf7riyUFJLw9lZaDerZOkvI7/Y2PfRvdfyZ57g/xgGeLY0Ch30SFVC04lNXIpsOWbLBOg0BMP9ZiciAYAF9Yc=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIreWuVcekgp7kF5pU+4TIKLHZyhuqd4Ly312ExEA5EG
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJWfXOTsTXqDhdGhW7VcUXsYqCS7TzCPyaa9/dA9e0xKjnni1/GRM8FdYXWYbGsNnBQFWk3/pXD6sj3jKzK34AM=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA/6JnQZ3CFC7xgv4DrvdZizVbVnsolKcWkvqzGu1hFHGmOEb7ehbxGPHBnp2N9iRf13H12EI0qNI6A2f44V0oXE3SP+fpJ6PVYQRQpKqTEiweqZaHEyYE2FnKy0HDQisg5hwr1egYLjGXChdkyqWSokL1LqaCyD2+EcOzUvC/GuVQ7eQnQBIGBpYAnNzS/64KKOZ0+0soOPJGxVCma6JN/2GcCunX6j3HmkOOQeuEFETXfUPHh1ylu2+3yINl34ERJN5YwgR/S+BKENOsJTu5XkYTCvc90CuvfkoF9K5Y2yE5nKwZaSf7n2SbUPil2Zph4l7opsd5IKxi6k2mVzw/CO2NHr136BZ06+sKXytDgorWqWzqnci8zfxeYF3D7q7AXD+IDVMP5T6op93oS2enAQFHG1vTLB0otQqnxUgNANbJkrKgXAS8G8I1m2sPz+qOFuuZa2/nqhzrd6/DEur5VoW6n9c/OcrbfapLEzD1jQDmsQI7oZkT++dt3Ogb3Vk=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIII1sLqY7Nqi1A3CKXLokfn1vrns/lK1gUkDNSlbek2o
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9QZXHUsthFMKA5Si4Htl7MIwK0G4VAltQgbo39JJHrgD7h27U1jbnuJQ1S2bBX8FMSkqf5TPmM7Gr9QOATO+4=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWbrXZxuAw0n/xJmOvWW/Qbg53ya2CuJKzcHA+OvDpHLHGxkEuiUhwKvqUbfSTzn0o1M00OYITJIvZVINGRtQC7hGvBPWLVBON097mcmnju857I72U3dGdvGhnEUHyrglCV+xSkafQTTlnY9B59EKImUs/kiwRy3cYDWkCgthJgiPA4QSw6WrzaqpY2ET+7n+yY31EOagGA3ufW43qFbHX4diFuXpS1I1PLvvA4KINlMlsFcyR29j4nQk/vb5hMpLmBOlfVH16CXZC98a0ltp9ib7F3e1Wjdogj92kxwfQMYIeQEBp11Tc/PY5U90J51oyk8xYOKfsP3+r9yczmfRDjwR3+tzUMKyZYAsKQVcOGQC7x9sEXg3mBeXRVrlIVZFMuNVcYq4CY40fDIybcI25GxgRbQR7ZUWODG1SL7RF02Z+LQB6APXkzxdQUWLWPryj/EtOgnHQ1I0+BJTWrqGkKbSj41jhRTfS+MZvRXAJ+fNyZFhpkHo54DrCii4cbyM=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGRPkwTcFVg/dIKRq29iWBfkoVFqIQ1pXOCPxfcGWRFF
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGf/hJ2dg/PRwojw63FLyKqua+ChKP+2bc7Eb0p70H6ve1elFVeY8lVRXx33JWc2m/XfgSWPNcUs9zBG8QcFVak=
                                              create=True mode=0644 path=/tmp/ansible.cgk9a7px state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:43.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:43 compute-2 sudo[112043]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:44 compute-2 sudo[112197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhddznnduiiqvskbnlhxneinsctkfojc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162324.4042234-198-132367103869182/AnsiballZ_command.py'
Jan 23 09:58:44 compute-2 sudo[112197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:45 compute-2 python3.9[112199]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.cgk9a7px' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:58:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:45 compute-2 sudo[112197]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:45 compute-2 ceph-mon[75771]: pgmap v156: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:45.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:45 compute-2 sudo[112351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddzkmarcxprhvvmhfcduqerqvfmtpvtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162325.326787-222-264932583379806/AnsiballZ_file.py'
Jan 23 09:58:45 compute-2 sudo[112351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:45.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:45 compute-2 python3.9[112353]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.cgk9a7px state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:46 compute-2 sudo[112351]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:46 compute-2 sshd-session[110969]: Connection closed by 192.168.122.30 port 39302
Jan 23 09:58:46 compute-2 sshd-session[110966]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:58:46 compute-2 systemd[1]: session-44.scope: Deactivated successfully.
Jan 23 09:58:46 compute-2 systemd[1]: session-44.scope: Consumed 5.826s CPU time.
Jan 23 09:58:46 compute-2 systemd-logind[786]: Session 44 logged out. Waiting for processes to exit.
Jan 23 09:58:46 compute-2 systemd-logind[786]: Removed session 44.
Jan 23 09:58:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:47.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:47.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:48 compute-2 ceph-mon[75771]: pgmap v157: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000051s ======
Jan 23 09:58:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:49.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Jan 23 09:58:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:49 compute-2 ceph-mon[75771]: pgmap v158: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:58:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:49.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:58:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:51 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy ignored for local
Jan 23 09:58:51 compute-2 kernel: ganesha.nfsd[111658]: segfault at 50 ip 00007fd51fbef32e sp 00007fd4a8ff8210 error 4 in libntirpc.so.5.8[7fd51fbd4000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 23 09:58:51 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 09:58:51 compute-2 systemd[1]: Started Process Core Dump (PID 112384/UID 0).
Jan 23 09:58:51 compute-2 ceph-mon[75771]: pgmap v159: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:51.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:51.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:52 compute-2 ceph-mon[75771]: pgmap v160: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 09:58:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:52 compute-2 systemd-coredump[112385]: Process 98321 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 60:
                                                    #0  0x00007fd51fbef32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 09:58:53 compute-2 systemd[1]: systemd-coredump@2-112384-0.service: Deactivated successfully.
Jan 23 09:58:53 compute-2 systemd[1]: systemd-coredump@2-112384-0.service: Consumed 1.649s CPU time.
Jan 23 09:58:53 compute-2 podman[112392]: 2026-01-23 09:58:53.095259898 +0000 UTC m=+0.035250895 container died d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 23 09:58:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:53 compute-2 systemd[1]: var-lib-containers-storage-overlay-b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf-merged.mount: Deactivated successfully.
Jan 23 09:58:53 compute-2 podman[112392]: 2026-01-23 09:58:53.273178711 +0000 UTC m=+0.213169688 container remove d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:58:53 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 09:58:53 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 09:58:53 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.128s CPU time.
Jan 23 09:58:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:53.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:53 compute-2 sshd-session[112435]: Accepted publickey for zuul from 192.168.122.30 port 55676 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:58:53 compute-2 systemd-logind[786]: New session 45 of user zuul.
Jan 23 09:58:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:53 compute-2 systemd[1]: Started Session 45 of User zuul.
Jan 23 09:58:53 compute-2 sshd-session[112435]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:58:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:53.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:54 compute-2 python3.9[112590]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:58:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:55 compute-2 ceph-mon[75771]: pgmap v161: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:55.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:55 compute-2 sudo[112744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsastrzhmasirorkzfjxvqxjzbvdbrli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162335.2334704-53-242978587557528/AnsiballZ_systemd.py'
Jan 23 09:58:55 compute-2 sudo[112744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:55.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:56 compute-2 python3.9[112746]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 09:58:56 compute-2 sudo[112744]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:56 compute-2 sudo[112900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfykkdklyjnokrlpmbrqmsvtsnlkhbun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162336.5172942-77-258324303363545/AnsiballZ_systemd.py'
Jan 23 09:58:56 compute-2 sudo[112900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:57 compute-2 python3.9[112902]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:58:57 compute-2 sudo[112900]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095857 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:58:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:58:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:57.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:58:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:57 compute-2 sudo[113003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:58:57 compute-2 sudo[113003]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:57 compute-2 sudo[113003]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:57 compute-2 ceph-mon[75771]: pgmap v162: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:57 compute-2 sudo[113079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogrlruubsfsoxddhpsprisrdmopsbbyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162337.4958332-104-205024805400578/AnsiballZ_command.py'
Jan 23 09:58:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:57.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:57 compute-2 sudo[113079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:58 compute-2 python3.9[113081]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:58:58 compute-2 sudo[113079]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:58 compute-2 sudo[113233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozfjbsvqmzcoquzofkqlfdwbsfqbyxct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162338.3392894-128-176114183510570/AnsiballZ_stat.py'
Jan 23 09:58:58 compute-2 sudo[113233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:58 compute-2 python3.9[113235]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:58:59 compute-2 sudo[113233]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:59 compute-2 ceph-mon[75771]: pgmap v163: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 09:58:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:58:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:59.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:58:59 compute-2 sudo[113385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igtkdyqvvovyxejoqktpvlzjsotbbrxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162339.2231834-155-32304325565859/AnsiballZ_file.py'
Jan 23 09:58:59 compute-2 sudo[113385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:58:59 compute-2 python3.9[113387]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:59 compute-2 sudo[113385]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:58:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:59.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:00 compute-2 sshd-session[112438]: Connection closed by 192.168.122.30 port 55676
Jan 23 09:59:00 compute-2 sshd-session[112435]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:59:00 compute-2 systemd[1]: session-45.scope: Deactivated successfully.
Jan 23 09:59:00 compute-2 systemd[1]: session-45.scope: Consumed 4.300s CPU time.
Jan 23 09:59:00 compute-2 systemd-logind[786]: Session 45 logged out. Waiting for processes to exit.
Jan 23 09:59:00 compute-2 systemd-logind[786]: Removed session 45.
Jan 23 09:59:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:00 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:01 compute-2 ceph-mon[75771]: pgmap v164: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:01.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:01 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:01.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:02 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:03 compute-2 ceph-mon[75771]: pgmap v165: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:03 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 3.
Jan 23 09:59:03 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:59:03 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.128s CPU time.
Jan 23 09:59:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:59:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:03.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:59:03 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:03 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:03 compute-2 podman[113462]: 2026-01-23 09:59:03.774987317 +0000 UTC m=+0.043076780 container create a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:59:03 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 09:59:03 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:59:03 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:59:03 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:59:03 compute-2 podman[113462]: 2026-01-23 09:59:03.756076401 +0000 UTC m=+0.024165884 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:59:03 compute-2 podman[113462]: 2026-01-23 09:59:03.852926509 +0000 UTC m=+0.121016002 container init a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 23 09:59:03 compute-2 podman[113462]: 2026-01-23 09:59:03.858571927 +0000 UTC m=+0.126661390 container start a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Jan 23 09:59:03 compute-2 bash[113462]: a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 09:59:03 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:59:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:03.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 09:59:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:59:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:04 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:05 compute-2 ceph-mon[75771]: pgmap v166: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:59:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:05.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:05 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:05.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:06 compute-2 sshd-session[113522]: Accepted publickey for zuul from 192.168.122.30 port 46994 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:59:06 compute-2 systemd-logind[786]: New session 46 of user zuul.
Jan 23 09:59:06 compute-2 systemd[1]: Started Session 46 of User zuul.
Jan 23 09:59:06 compute-2 sshd-session[113522]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:59:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:06 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:07 compute-2 python3.9[113676]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:59:07 compute-2 ceph-mon[75771]: pgmap v167: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:07.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:07 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:07.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:08 compute-2 sudo[113831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxwnckkbivpekwdatpjkhavaifuhwxpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162347.7149043-59-275506700867768/AnsiballZ_setup.py'
Jan 23 09:59:08 compute-2 sudo[113831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:08 compute-2 python3.9[113833]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:59:08 compute-2 ceph-mon[75771]: pgmap v168: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:08 compute-2 sudo[113831]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:08 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:08 compute-2 sudo[113916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhijmylpdmcxbtabqsrgaqfuqtoabwqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162347.7149043-59-275506700867768/AnsiballZ_dnf.py'
Jan 23 09:59:08 compute-2 sudo[113916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:09 compute-2 python3.9[113918]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 09:59:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:09.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:09 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:59:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:09.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:59:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:10 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:59:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:10 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:59:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:10 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:10 compute-2 sudo[113916]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:11 compute-2 ceph-mon[75771]: pgmap v169: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:59:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:11.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:59:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:11 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:11 compute-2 python3.9[114071]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:59:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:11.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:12 compute-2 ceph-mon[75771]: pgmap v170: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:12 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:13 compute-2 python3.9[114224]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:59:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:13 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:59:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:13.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:13 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:59:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:13 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:13.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:14 compute-2 python3.9[114375]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:59:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:14 compute-2 python3.9[114527]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:59:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:14 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:15 compute-2 ceph-mon[75771]: pgmap v171: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:15 compute-2 sshd-session[113525]: Connection closed by 192.168.122.30 port 46994
Jan 23 09:59:15 compute-2 sshd-session[113522]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:59:15 compute-2 systemd-logind[786]: Session 46 logged out. Waiting for processes to exit.
Jan 23 09:59:15 compute-2 systemd[1]: session-46.scope: Deactivated successfully.
Jan 23 09:59:15 compute-2 systemd[1]: session-46.scope: Consumed 6.502s CPU time.
Jan 23 09:59:15 compute-2 systemd-logind[786]: Removed session 46.
Jan 23 09:59:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:59:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:15.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:59:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:15 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:15.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:16 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 09:59:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:59:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:17 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:17 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:17 compute-2 ceph-mon[75771]: pgmap v172: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:17.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:17 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:17 compute-2 sudo[114568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:59:17 compute-2 sudo[114568]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:17 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:17 compute-2 sudo[114568]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:17.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:18 compute-2 ceph-mon[75771]: pgmap v173: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:59:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:18 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:19 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095919 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:59:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:19 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:19.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:19 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:19 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c4002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:19.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:59:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:20 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:21 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:21 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:21.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:21 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:21 compute-2 ceph-mon[75771]: pgmap v174: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:21 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:21.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:22 compute-2 sshd-session[114599]: Accepted publickey for zuul from 192.168.122.30 port 37162 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:59:22 compute-2 systemd-logind[786]: New session 47 of user zuul.
Jan 23 09:59:22 compute-2 systemd[1]: Started Session 47 of User zuul.
Jan 23 09:59:22 compute-2 sshd-session[114599]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:59:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:22 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:22 compute-2 ceph-mon[75771]: pgmap v175: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:23 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:23 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4001cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:23 compute-2 python3.9[114752]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:59:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:23.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:23 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:23 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:59:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:23.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:59:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:24 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:25 compute-2 sudo[114908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcebmsomcydsyqnkzldfygvhtkwgpqtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162364.630263-108-210632908405962/AnsiballZ_file.py'
Jan 23 09:59:25 compute-2 sudo[114908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:25 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c4002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:25 compute-2 python3.9[114910]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:25 compute-2 sudo[114908]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:25 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d0002390 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:25 compute-2 ceph-mon[75771]: pgmap v176: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 09:59:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:25.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 09:59:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:25 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:25 compute-2 sudo[115060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ualzstbpidnsajkeigaoyqpjcmzeoukx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162365.4471283-108-269772001534799/AnsiballZ_file.py'
Jan 23 09:59:25 compute-2 sudo[115060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:25 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4001cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:25 compute-2 python3.9[115062]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:25.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:25 compute-2 sudo[115060]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:26 compute-2 sudo[115214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdskbywrhemeisayqdfcfvecvjsogdnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162366.1671932-153-211043007712582/AnsiballZ_stat.py'
Jan 23 09:59:26 compute-2 sudo[115214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:26 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:26 compute-2 ceph-mon[75771]: pgmap v177: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:26 compute-2 python3.9[115216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:26 compute-2 sudo[115214]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:27 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:27 compute-2 sudo[115337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skelirutofpzretpgijqapxrwageuqco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162366.1671932-153-211043007712582/AnsiballZ_copy.py'
Jan 23 09:59:27 compute-2 sudo[115337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:27 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4001cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:27 compute-2 python3.9[115339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162366.1671932-153-211043007712582/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=5f463a1334205a2aad5395f81514ee931215e9c5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:27 compute-2 sudo[115337]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:27.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:27 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:27 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c4002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:27 compute-2 sudo[115490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgsvqtyhnadkwpyfdfseciqgbbvnhevc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162367.6790667-153-217915155126637/AnsiballZ_stat.py'
Jan 23 09:59:27 compute-2 sudo[115490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:27.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:28 compute-2 python3.9[115492]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:28 compute-2 sudo[115490]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:28 compute-2 sudo[115614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drmgzobwwltspkroyqdvlxhmojzakbje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162367.6790667-153-217915155126637/AnsiballZ_copy.py'
Jan 23 09:59:28 compute-2 sudo[115614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:28 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:28 compute-2 python3.9[115616]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162367.6790667-153-217915155126637/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=ff17d6d1438a69ae92e7570d79b66fb807ae4885 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:28 compute-2 sudo[115614]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:29 compute-2 ceph-mon[75771]: pgmap v178: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:29 compute-2 sudo[115766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxmhczifzslzfbfhezewfwxuchcugann ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162368.8758805-153-249246164828380/AnsiballZ_stat.py'
Jan 23 09:59:29 compute-2 sudo[115766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:29 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:29 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:29 compute-2 python3.9[115768]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:29 compute-2 sudo[115766]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:29.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:29 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:29 compute-2 sudo[115889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fayzzvytsjuagsuqodcqfxdkdaprdlje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162368.8758805-153-249246164828380/AnsiballZ_copy.py'
Jan 23 09:59:29 compute-2 sudo[115889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:29 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:29 compute-2 python3.9[115891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162368.8758805-153-249246164828380/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=74ae1654d12c23c4d6b67ccf19cdb7558a450192 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:29 compute-2 sudo[115889]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:29.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.349027) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370349244, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1159, "num_deletes": 251, "total_data_size": 2886732, "memory_usage": 2912480, "flush_reason": "Manual Compaction"}
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370364957, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1881107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11950, "largest_seqno": 13103, "table_properties": {"data_size": 1876037, "index_size": 2594, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10506, "raw_average_key_size": 19, "raw_value_size": 1865914, "raw_average_value_size": 3386, "num_data_blocks": 116, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162273, "oldest_key_time": 1769162273, "file_creation_time": 1769162370, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 16046 microseconds, and 7197 cpu microseconds.
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.365125) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1881107 bytes OK
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.365170) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.367702) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.367752) EVENT_LOG_v1 {"time_micros": 1769162370367745, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.367783) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 2881188, prev total WAL file size 2881188, number of live WAL files 2.
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.369035) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1837KB)], [24(12MB)]
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370369200, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15336228, "oldest_snapshot_seqno": -1}
Jan 23 09:59:30 compute-2 sudo[116043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqrztwinrbtkutbezzjuxdwbrrkmllcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162370.181035-282-255507860595184/AnsiballZ_file.py'
Jan 23 09:59:30 compute-2 sudo[116043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:30 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:30 compute-2 python3.9[116045]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:30 compute-2 sudo[116043]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4279 keys, 13223563 bytes, temperature: kUnknown
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370877589, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 13223563, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13192084, "index_size": 19657, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109428, "raw_average_key_size": 25, "raw_value_size": 13110978, "raw_average_value_size": 3064, "num_data_blocks": 828, "num_entries": 4279, "num_filter_entries": 4279, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162370, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.974956) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 13223563 bytes
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.978468) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 30.2 rd, 26.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 12.8 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(15.2) write-amplify(7.0) OK, records in: 4795, records dropped: 516 output_compression: NoCompression
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.978515) EVENT_LOG_v1 {"time_micros": 1769162370978498, "job": 12, "event": "compaction_finished", "compaction_time_micros": 508562, "compaction_time_cpu_micros": 45799, "output_level": 6, "num_output_files": 1, "total_output_size": 13223563, "num_input_records": 4795, "num_output_records": 4279, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370979246, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370982482, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.368883) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982591) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982599) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982601) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:31 compute-2 sudo[116195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtgaegkvjgcqdycxvestcvyxssiiaslk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162370.8598623-282-161137509502654/AnsiballZ_file.py'
Jan 23 09:59:31 compute-2 sudo[116195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:31 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c4002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:31 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d0009740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:31 compute-2 ceph-mon[75771]: pgmap v179: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:59:31 compute-2 python3.9[116197]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:31 compute-2 sudo[116195]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:31.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:31 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:31 compute-2 sudo[116347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehytookawibwypcxekfacsjucsbfheer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162371.590063-325-153310206666673/AnsiballZ_stat.py'
Jan 23 09:59:31 compute-2 sudo[116347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:31 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b8002b10 fd 38 proxy ignored for local
Jan 23 09:59:31 compute-2 kernel: ganesha.nfsd[114567]: segfault at 50 ip 00007f3759e1a32e sp 00007f36c2ffc210 error 4 in libntirpc.so.5.8[7f3759dff000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 23 09:59:31 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 09:59:31 compute-2 systemd[1]: Started Process Core Dump (PID 116351/UID 0).
Jan 23 09:59:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:31.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:32 compute-2 python3.9[116349]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:32 compute-2 sudo[116347]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:32 compute-2 sudo[116474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivvhepkfzvoejcnfahdmatdonsbewmbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162371.590063-325-153310206666673/AnsiballZ_copy.py'
Jan 23 09:59:32 compute-2 sudo[116474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:32 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:32 compute-2 python3.9[116476]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162371.590063-325-153310206666673/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=d3518b087b935787ae8459844310cc45ab489248 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:32 compute-2 sudo[116474]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:33 compute-2 sudo[116626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otnkksjureluytexypbcliviwebyqxha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162372.8841522-325-34392993972033/AnsiballZ_stat.py'
Jan 23 09:59:33 compute-2 sudo[116626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:33 compute-2 systemd-coredump[116352]: Process 113481 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 53:
                                                    #0  0x00007f3759e1a32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007f3759e24900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 09:59:33 compute-2 python3.9[116628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:33 compute-2 sudo[116626]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:33 compute-2 systemd[1]: systemd-coredump@3-116351-0.service: Deactivated successfully.
Jan 23 09:59:33 compute-2 systemd[1]: systemd-coredump@3-116351-0.service: Consumed 1.400s CPU time.
Jan 23 09:59:33 compute-2 podman[116641]: 2026-01-23 09:59:33.481914055 +0000 UTC m=+0.033789712 container died a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 23 09:59:33 compute-2 systemd[1]: var-lib-containers-storage-overlay-a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad-merged.mount: Deactivated successfully.
Jan 23 09:59:33 compute-2 podman[116641]: 2026-01-23 09:59:33.526123435 +0000 UTC m=+0.077999072 container remove a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2)
Jan 23 09:59:33 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 09:59:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:33.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:33 compute-2 ceph-mon[75771]: pgmap v180: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:59:33 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 09:59:33 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.583s CPU time.
Jan 23 09:59:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:33 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:33 compute-2 sudo[116796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scqivxqjggzmvvhfyvuyvxicnddofboc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162372.8841522-325-34392993972033/AnsiballZ_copy.py'
Jan 23 09:59:33 compute-2 sudo[116796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:33 compute-2 python3.9[116798]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162372.8841522-325-34392993972033/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=7ea5769d722c11e7459792c631f886a53fdd1360 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:33 compute-2 sudo[116796]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:33.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:34 compute-2 sudo[116950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttdztlbramnhiygbgcdxmdsxcwmkahtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162374.1205697-325-112555519153618/AnsiballZ_stat.py'
Jan 23 09:59:34 compute-2 sudo[116950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:34 compute-2 python3.9[116952]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:34 compute-2 sudo[116950]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:34 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:34 compute-2 sudo[117073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfiyrlzwbirkzdqadxlmmjyqzebpstfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162374.1205697-325-112555519153618/AnsiballZ_copy.py'
Jan 23 09:59:34 compute-2 sudo[117073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:35 compute-2 python3.9[117075]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162374.1205697-325-112555519153618/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=c7b2cc1434b948bda234b68388cbd799abca388a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:35 compute-2 sudo[117073]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:35 compute-2 ceph-mon[75771]: pgmap v181: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:35.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:35 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:35 compute-2 sudo[117225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrpylyvjiougyyoakllhbbuatphonlzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162375.466436-460-143206879698239/AnsiballZ_file.py'
Jan 23 09:59:35 compute-2 sudo[117225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:35 compute-2 python3.9[117227]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:35.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:36 compute-2 sudo[117225]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:36 compute-2 sudo[117326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:59:36 compute-2 sudo[117326]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:36 compute-2 sudo[117326]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:36 compute-2 sudo[117367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:59:36 compute-2 sudo[117367]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:36 compute-2 sudo[117429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeyspimliurizmxglmntkscwpjsanttt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162376.153543-460-80345232067098/AnsiballZ_file.py'
Jan 23 09:59:36 compute-2 sudo[117429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:36 compute-2 python3.9[117431]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:36 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:36 compute-2 sudo[117429]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:59:36 compute-2 ceph-mon[75771]: pgmap v182: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:36 compute-2 sudo[117367]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:37 compute-2 sudo[117613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htnadnfbjycavfetcjqtequlmcfalefd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162376.8699226-503-40811081212946/AnsiballZ_stat.py'
Jan 23 09:59:37 compute-2 sudo[117613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:37 compute-2 python3.9[117615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:37 compute-2 sudo[117613]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:37.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:37 compute-2 sudo[117736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqwfztvzkjvmspylbdjymfewnxbfuzxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162376.8699226-503-40811081212946/AnsiballZ_copy.py'
Jan 23 09:59:37 compute-2 sudo[117736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:37 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:37 compute-2 python3.9[117738]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162376.8699226-503-40811081212946/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=377da3c129b85449f0af58d2fb6b8163dbdb149d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:37 compute-2 sudo[117740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:59:37 compute-2 sudo[117740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:37 compute-2 sudo[117740]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:37 compute-2 sudo[117736]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:37.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:38 compute-2 sudo[117915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdujsztxcemjclluxfrzkdztbmcookbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162378.0618517-503-158348946627568/AnsiballZ_stat.py'
Jan 23 09:59:38 compute-2 sudo[117915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:38 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:38 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:38 compute-2 python3.9[117917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:38 compute-2 sudo[117915]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:38 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:38 compute-2 sudo[118038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfqyikysrbbcwvlmklcqgjnqyqcvfvnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162378.0618517-503-158348946627568/AnsiballZ_copy.py'
Jan 23 09:59:38 compute-2 sudo[118038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:39 compute-2 python3.9[118040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162378.0618517-503-158348946627568/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=7ea5769d722c11e7459792c631f886a53fdd1360 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:39 compute-2 sudo[118038]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095939 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:59:39 compute-2 sudo[118190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-longbxqsymdbscxfdkwejuuzvluwmwnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162379.2646806-503-23478166857840/AnsiballZ_stat.py'
Jan 23 09:59:39 compute-2 sudo[118190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:39.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:39 compute-2 ceph-mon[75771]: pgmap v183: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:59:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:59:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:59:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:59:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:59:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:59:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:39 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:39 compute-2 python3.9[118192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:39 compute-2 sudo[118190]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:39.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:40 compute-2 sudo[118314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prfvxumqgytnkkedqxgzvyzpzixdkpdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162379.2646806-503-23478166857840/AnsiballZ_copy.py'
Jan 23 09:59:40 compute-2 sudo[118314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:40 compute-2 python3.9[118316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162379.2646806-503-23478166857840/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=d8c8d38f928b275e03f3fce0093c5c40b17e4fa7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:40 compute-2 sudo[118314]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:40 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:40 compute-2 ceph-mon[75771]: pgmap v184: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:41 compute-2 sudo[118467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwaqervzlugupztqzzvuvucnojrdehny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162381.17934-672-89243613740787/AnsiballZ_file.py'
Jan 23 09:59:41 compute-2 sudo[118467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:41.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:41 compute-2 python3.9[118469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:41 compute-2 sudo[118467]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:41 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:41.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:42 compute-2 sudo[118620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fayslavjgjyolbfxdcvdnulycooicutx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162381.8422322-703-87408833851170/AnsiballZ_stat.py'
Jan 23 09:59:42 compute-2 sudo[118620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:42 compute-2 python3.9[118622]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:42 compute-2 sudo[118620]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:42 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:42 compute-2 sudo[118744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzakhvzwpbmmxpytootljfnhiwujowbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162381.8422322-703-87408833851170/AnsiballZ_copy.py'
Jan 23 09:59:42 compute-2 sudo[118744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:42 compute-2 python3.9[118746]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162381.8422322-703-87408833851170/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:42 compute-2 sudo[118744]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:43 compute-2 sudo[118896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afustyaijvxmybvrnnhufvztjgpnyctq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162383.123042-752-100309235418087/AnsiballZ_file.py'
Jan 23 09:59:43 compute-2 sudo[118896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:43.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:43 compute-2 python3.9[118898]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:43 compute-2 sudo[118896]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:43 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:43 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 4.
Jan 23 09:59:43 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:59:43 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.583s CPU time.
Jan 23 09:59:43 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:59:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:43.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:44 compute-2 sudo[119106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chvprvzicxfhegxchepsnncpbmruyvlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162383.8366396-772-219614496771385/AnsiballZ_stat.py'
Jan 23 09:59:44 compute-2 sudo[119106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:44 compute-2 podman[119043]: 2026-01-23 09:59:44.026744341 +0000 UTC m=+0.024273455 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:59:44 compute-2 ceph-mon[75771]: pgmap v185: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:44 compute-2 podman[119043]: 2026-01-23 09:59:44.216784822 +0000 UTC m=+0.214313916 container create 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Jan 23 09:59:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 09:59:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:59:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:59:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:59:44 compute-2 python3.9[119108]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:44 compute-2 sudo[119106]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:44 compute-2 sudo[119235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldqajnfxenhgyeoxevbvvcnbfziitdqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162383.8366396-772-219614496771385/AnsiballZ_copy.py'
Jan 23 09:59:44 compute-2 sudo[119235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:44 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:44 compute-2 podman[119043]: 2026-01-23 09:59:44.770917798 +0000 UTC m=+0.768446902 container init 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 23 09:59:44 compute-2 podman[119043]: 2026-01-23 09:59:44.77624462 +0000 UTC m=+0.773773714 container start 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:59:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:44 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 09:59:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:44 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 09:59:44 compute-2 bash[119043]: 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9
Jan 23 09:59:44 compute-2 python3.9[119237]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162383.8366396-772-219614496771385/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:44 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:59:44 compute-2 sudo[119235]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 09:59:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 09:59:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 09:59:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 09:59:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 09:59:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:59:45 compute-2 sudo[119426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-indfxsiqgrxnsuzadhntmocovynqlqfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162385.173175-815-233484996151107/AnsiballZ_file.py'
Jan 23 09:59:45 compute-2 sudo[119426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:45.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:45 compute-2 python3.9[119428]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:45 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:45 compute-2 sudo[119426]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:45.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:46 compute-2 sudo[119579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aejmkfunbrhcukwtodtjvgupznwdzvcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162385.854258-834-205280207384106/AnsiballZ_stat.py'
Jan 23 09:59:46 compute-2 sudo[119579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:46 compute-2 ceph-mon[75771]: pgmap v186: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:46 compute-2 python3.9[119581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:46 compute-2 sudo[119579]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:46 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:46 compute-2 sudo[119703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsdjavbxbrnlirzmdgtmwmoohmpytefa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162385.854258-834-205280207384106/AnsiballZ_copy.py'
Jan 23 09:59:46 compute-2 sudo[119703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:46 compute-2 python3.9[119705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162385.854258-834-205280207384106/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:46 compute-2 sudo[119703]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:47 compute-2 sudo[119855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgiotbewlvqhcpmlcfjtwxuszyeqavfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162387.1634464-882-125802422258511/AnsiballZ_file.py'
Jan 23 09:59:47 compute-2 sudo[119855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:47.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:47 compute-2 python3.9[119857]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:47 compute-2 sudo[119855]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:47 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 09:59:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:48.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 09:59:48 compute-2 sudo[120008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tptvdghkrboogkkjrexwllnichmffrkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162387.8260078-906-205682684487946/AnsiballZ_stat.py'
Jan 23 09:59:48 compute-2 sudo[120008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:48 compute-2 python3.9[120010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:48 compute-2 sudo[120008]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:48 compute-2 ceph-mon[75771]: pgmap v187: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:48 compute-2 sudo[120132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwvgmoajmthzpdlupogmwrcgbwhdibmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162387.8260078-906-205682684487946/AnsiballZ_copy.py'
Jan 23 09:59:48 compute-2 sudo[120132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:48 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:48 compute-2 python3.9[120134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162387.8260078-906-205682684487946/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:48 compute-2 sudo[120132]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:49 compute-2 sudo[120284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zinputwmsgebnfgnvewoqxqvjrrdylut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162389.1035843-948-51047243813399/AnsiballZ_file.py'
Jan 23 09:59:49 compute-2 sudo[120284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:49 compute-2 ceph-mon[75771]: pgmap v188: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:49 compute-2 python3.9[120286]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:49.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:49 compute-2 sudo[120284]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:49 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:49 compute-2 sudo[120291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:59:49 compute-2 sudo[120291]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:49 compute-2 sudo[120291]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:50.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:50 compute-2 sudo[120462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krclnlzyikrcqiqstglhhiaohosqicpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162389.7764578-971-195493562681958/AnsiballZ_stat.py'
Jan 23 09:59:50 compute-2 sudo[120462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:50 compute-2 python3.9[120464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:50 compute-2 sudo[120462]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:59:50 compute-2 sudo[120586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtqpbfqowlmxppbkztrjnndkxmyrxjyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162389.7764578-971-195493562681958/AnsiballZ_copy.py'
Jan 23 09:59:50 compute-2 sudo[120586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:50 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:50 compute-2 python3.9[120588]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162389.7764578-971-195493562681958/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:50 compute-2 sudo[120586]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:51 compute-2 sudo[120738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqhixfwqggavxgygbulookguovmnzjkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162391.0505505-1016-191901824522814/AnsiballZ_file.py'
Jan 23 09:59:51 compute-2 sudo[120738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:59:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:59:51 compute-2 python3.9[120740]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:51 compute-2 sudo[120738]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:51.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:51 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:51 compute-2 ceph-mon[75771]: pgmap v189: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:51 compute-2 sudo[120892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avvkvigyhaouawyvecvhzpoveltefuos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162391.6924145-1042-182145774667505/AnsiballZ_stat.py'
Jan 23 09:59:51 compute-2 sudo[120892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:52.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:52 compute-2 python3.9[120894]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:52 compute-2 sudo[120892]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:52 compute-2 sudo[121016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coenkbjauwumgfizznnrntgirowcujwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162391.6924145-1042-182145774667505/AnsiballZ_copy.py'
Jan 23 09:59:52 compute-2 sudo[121016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:52 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:52 compute-2 python3.9[121018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162391.6924145-1042-182145774667505/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:52 compute-2 sudo[121016]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:52 compute-2 ceph-mon[75771]: pgmap v190: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 595 B/s wr, 1 op/s
Jan 23 09:59:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:53.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:53 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:54.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:54 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:55 compute-2 ceph-mon[75771]: pgmap v191: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 595 B/s wr, 1 op/s
Jan 23 09:59:55 compute-2 sshd-session[114602]: Connection closed by 192.168.122.30 port 37162
Jan 23 09:59:55 compute-2 sshd-session[114599]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:59:55 compute-2 systemd[1]: session-47.scope: Deactivated successfully.
Jan 23 09:59:55 compute-2 systemd[1]: session-47.scope: Consumed 23.274s CPU time.
Jan 23 09:59:55 compute-2 systemd-logind[786]: Session 47 logged out. Waiting for processes to exit.
Jan 23 09:59:55 compute-2 systemd-logind[786]: Removed session 47.
Jan 23 09:59:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 09:59:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:55.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 09:59:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:55 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:56.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:56 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095956 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:59:56 compute-2 ceph-mon[75771]: pgmap v192: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 595 B/s wr, 1 op/s
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:59:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:57.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:57 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e10000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:58.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:58 compute-2 sudo[121064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:59:58 compute-2 sudo[121064]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:58 compute-2 sudo[121064]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:58 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e08001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:59 compute-2 ceph-mon[75771]: pgmap v193: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 936 B/s wr, 3 op/s
Jan 23 09:59:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 09:59:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:59.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:59 2026: (VI_0) received an invalid passwd!
Jan 23 09:59:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:00.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:00 compute-2 ceph-mon[75771]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:00:00 compute-2 ceph-mon[75771]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:00:00 compute-2 ceph-mon[75771]:      osd.1 observed slow operation indications in BlueStore
Jan 23 10:00:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100001 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:00:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e08001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:01.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:01 compute-2 ceph-mon[75771]: pgmap v194: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 851 B/s wr, 2 op/s
Jan 23 10:00:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:02.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:02 compute-2 sshd-session[121093]: Accepted publickey for zuul from 192.168.122.30 port 38938 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:00:02 compute-2 systemd-logind[786]: New session 48 of user zuul.
Jan 23 10:00:02 compute-2 systemd[1]: Started Session 48 of User zuul.
Jan 23 10:00:02 compute-2 sshd-session[121093]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:00:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:02 compute-2 ceph-mon[75771]: pgmap v195: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 851 B/s wr, 2 op/s
Jan 23 10:00:02 compute-2 sudo[121247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zemclyoyghbdoxjwwkewctsulmjrlsds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162402.3723712-23-229237456364749/AnsiballZ_file.py'
Jan 23 10:00:02 compute-2 sudo[121247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:03 compute-2 python3.9[121249]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:03 compute-2 sudo[121247]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:03.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:03 compute-2 sudo[121399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adzvegisxizkakwjyyrztzrjoeapthzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162403.3548565-59-245424719842205/AnsiballZ_stat.py'
Jan 23 10:00:03 compute-2 sudo[121399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e08001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:04.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:04 compute-2 python3.9[121401]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:04 compute-2 sudo[121399]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:04 compute-2 sudo[121524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odkgqjvmxsghzxqrhqfbumotrfuodkzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162403.3548565-59-245424719842205/AnsiballZ_copy.py'
Jan 23 10:00:04 compute-2 sudo[121524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:04 compute-2 python3.9[121526]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162403.3548565-59-245424719842205/.source.conf _original_basename=ceph.conf follow=False checksum=c8d90d44a83782ff84a3d797d09c3b204e2d1c61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:04 compute-2 sudo[121524]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:05 compute-2 sudo[121676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qllptofoiqbinhxezokaskgoypeicazv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162404.917758-59-15592740715908/AnsiballZ_stat.py'
Jan 23 10:00:05 compute-2 sudo[121676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:05 compute-2 python3.9[121678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:05 compute-2 sudo[121676]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:05 compute-2 ceph-mon[75771]: pgmap v196: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Jan 23 10:00:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:05.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:00:05 compute-2 sudo[121799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-busyuyaugtymrtjcisdxjfcwclhnmblk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162404.917758-59-15592740715908/AnsiballZ_copy.py'
Jan 23 10:00:05 compute-2 sudo[121799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:05 compute-2 python3.9[121801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162404.917758-59-15592740715908/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=a6273c4bda164a032598e5e81cbd7f6e9c0876d5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:06 compute-2 sudo[121799]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:06.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:00:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:06 compute-2 sshd-session[121097]: Connection closed by 192.168.122.30 port 38938
Jan 23 10:00:06 compute-2 sshd-session[121093]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:00:06 compute-2 systemd[1]: session-48.scope: Deactivated successfully.
Jan 23 10:00:06 compute-2 systemd[1]: session-48.scope: Consumed 2.714s CPU time.
Jan 23 10:00:06 compute-2 systemd-logind[786]: Session 48 logged out. Waiting for processes to exit.
Jan 23 10:00:06 compute-2 systemd-logind[786]: Removed session 48.
Jan 23 10:00:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e08001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:07.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:07 compute-2 ceph-mon[75771]: pgmap v197: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Jan 23 10:00:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:08.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:08 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:00:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:08 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:00:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:08 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:00:08 compute-2 ceph-mon[75771]: pgmap v198: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 852 B/s wr, 2 op/s
Jan 23 10:00:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:09.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:10.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:11 compute-2 ceph-mon[75771]: pgmap v199: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Jan 23 10:00:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:11.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:00:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:12.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:12 compute-2 sshd-session[121834]: Accepted publickey for zuul from 192.168.122.30 port 45078 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:00:12 compute-2 systemd-logind[786]: New session 49 of user zuul.
Jan 23 10:00:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:12 compute-2 systemd[1]: Started Session 49 of User zuul.
Jan 23 10:00:12 compute-2 sshd-session[121834]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:00:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:13 compute-2 ceph-mon[75771]: pgmap v200: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:00:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:13.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:13 compute-2 python3.9[121987]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:00:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:14.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:14 compute-2 sudo[122143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sowhbmgvjozsbjejhghdlbvldxbtwlnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162414.4479446-59-30344824547511/AnsiballZ_file.py'
Jan 23 10:00:14 compute-2 sudo[122143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:15 compute-2 python3.9[122145]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:15 compute-2 sudo[122143]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:15 compute-2 sudo[122295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdgjkfjfweykqawrjtibcptpamjlbcev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162415.278792-59-97612511885021/AnsiballZ_file.py'
Jan 23 10:00:15 compute-2 sudo[122295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:15 compute-2 ceph-mon[75771]: pgmap v201: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:00:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:15.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:15 compute-2 python3.9[122297]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:15 compute-2 sudo[122295]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:16.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:16 compute-2 python3.9[122449]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:00:17 compute-2 ceph-mon[75771]: pgmap v202: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:00:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:17 compute-2 sudo[122599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvijvxneotzanzzwkrnltgxkfiytgfbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162417.0612245-128-68645679771899/AnsiballZ_seboolean.py'
Jan 23 10:00:17 compute-2 sudo[122599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:17.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:17 compute-2 python3.9[122601]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 10:00:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:18.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:18 compute-2 sudo[122603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:00:18 compute-2 sudo[122603]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:18 compute-2 sudo[122603]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100018 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:00:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:19.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:19 compute-2 ceph-mon[75771]: pgmap v203: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:00:19 compute-2 sudo[122599]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:20.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:20 compute-2 sudo[122785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwyybmgqviwramvkypatxtadvefauhue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162420.2690494-158-251196377528837/AnsiballZ_setup.py'
Jan 23 10:00:20 compute-2 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 23 10:00:20 compute-2 sudo[122785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:20 compute-2 python3.9[122787]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 10:00:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:21 compute-2 sudo[122785]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:00:21 compute-2 ceph-mon[75771]: pgmap v204: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 511 B/s wr, 2 op/s
Jan 23 10:00:21 compute-2 sudo[122869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdruafpkbmuoxuncedjryxszvknloxji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162420.2690494-158-251196377528837/AnsiballZ_dnf.py'
Jan 23 10:00:21 compute-2 sudo[122869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:21.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:21 compute-2 python3.9[122871]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:00:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:22.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:22 compute-2 ceph-mon[75771]: pgmap v205: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 511 B/s wr, 2 op/s
Jan 23 10:00:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:23 compute-2 sudo[122869]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:23.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:24.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:24 compute-2 sudo[123026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyejucpsfjvqdzgmktpzvmxunghfjabh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162423.6864822-194-23806276624409/AnsiballZ_systemd.py'
Jan 23 10:00:24 compute-2 sudo[123026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:24 compute-2 python3.9[123028]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:00:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:24 compute-2 sudo[123026]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:25 compute-2 ceph-mon[75771]: pgmap v206: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:00:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:25 compute-2 sudo[123181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryzeawbhogrlmaaizzvutzruccuwuhae ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162424.9843209-219-18031659168699/AnsiballZ_edpm_nftables_snippet.py'
Jan 23 10:00:25 compute-2 sudo[123181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:25 compute-2 python3[123183]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 23 10:00:25 compute-2 sudo[123181]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:25.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:26.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:26 compute-2 sudo[123335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdnhrjbgouvedtfgwgurgfctvtexuoba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162426.216929-246-99451178986104/AnsiballZ_file.py'
Jan 23 10:00:26 compute-2 sudo[123335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:26 compute-2 ceph-mon[75771]: pgmap v207: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:00:26 compute-2 python3.9[123337]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:26 compute-2 sudo[123335]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:27 compute-2 sudo[123487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aezlislsnghwwyysqxbnwpvhrhquwoss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162426.8691654-269-45565510346478/AnsiballZ_stat.py'
Jan 23 10:00:27 compute-2 sudo[123487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:27 compute-2 python3.9[123489]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:27 compute-2 sudo[123487]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:27.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:27 compute-2 sudo[123565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hckphjlryiyhpayptmeyqxcbxioecwfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162426.8691654-269-45565510346478/AnsiballZ_file.py'
Jan 23 10:00:27 compute-2 sudo[123565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:27 compute-2 python3.9[123567]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:27 compute-2 sudo[123565]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:28.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:28 compute-2 sudo[123719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiupbaqkhjlljcpncwdcrngyagcdjrwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162428.2182467-305-225603842578826/AnsiballZ_stat.py'
Jan 23 10:00:28 compute-2 sudo[123719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:28 compute-2 python3.9[123721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:28 compute-2 sudo[123719]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:28 compute-2 sudo[123797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygcokanmsfjhyxihrdxsbsuxdgqdynsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162428.2182467-305-225603842578826/AnsiballZ_file.py'
Jan 23 10:00:28 compute-2 sudo[123797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:29 compute-2 python3.9[123799]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.gjyi_25y recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:29 compute-2 sudo[123797]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:29 compute-2 ceph-mon[75771]: pgmap v208: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:00:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:29.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:29 compute-2 sudo[123951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwmmfoyvqzlhmvthtwgilokjwjzjkvvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162429.429031-341-55500910632074/AnsiballZ_stat.py'
Jan 23 10:00:29 compute-2 sudo[123951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:29 compute-2 python3.9[123953]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:29 compute-2 sudo[123951]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:30.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:30 compute-2 sudo[124031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmncaltrngqkhiazyjbsrmmvpehjelwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162429.429031-341-55500910632074/AnsiballZ_file.py'
Jan 23 10:00:30 compute-2 sudo[124031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:30 compute-2 python3.9[124033]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:30 compute-2 sudo[124031]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:31 compute-2 sudo[124183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iypsvuyveoxzbemlxtgedyzypongcsph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162430.8870294-380-246959745901840/AnsiballZ_command.py'
Jan 23 10:00:31 compute-2 sudo[124183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:31 compute-2 ceph-mon[75771]: pgmap v209: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:31 compute-2 python3.9[124185]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:31 compute-2 sudo[124183]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:31.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:32.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:32 compute-2 sudo[124337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wksbuebcblpmbdfntpvpaynqzzjdzzmn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162431.82553-405-183990297861891/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 10:00:32 compute-2 sudo[124337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:32 compute-2 python3[124340]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 10:00:32 compute-2 sudo[124337]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:32 compute-2 ceph-mon[75771]: pgmap v210: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:33 compute-2 sudo[124490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irapefuowsvhdktdiermxxztsprswkuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162432.7393332-429-13514922869121/AnsiballZ_stat.py'
Jan 23 10:00:33 compute-2 sudo[124490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:33 compute-2 python3.9[124492]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:33 compute-2 sudo[124490]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:33.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:33 compute-2 sudo[124615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvlybfnzvrsayeuppftxlzxueftvlqly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162432.7393332-429-13514922869121/AnsiballZ_copy.py'
Jan 23 10:00:33 compute-2 sudo[124615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:34 compute-2 python3.9[124617]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162432.7393332-429-13514922869121/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:34 compute-2 sudo[124615]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:34.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:34 compute-2 sudo[124769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpoiulxdeybpppuwdgpuzkygffbfbdjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162434.2852414-474-106652998090111/AnsiballZ_stat.py'
Jan 23 10:00:34 compute-2 sudo[124769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:35 compute-2 python3.9[124771]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:35 compute-2 sudo[124769]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:35 compute-2 ceph-mon[75771]: pgmap v211: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:00:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:35 compute-2 sudo[124894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzvkwxtycjmxbsujdljinivvxoemogxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162434.2852414-474-106652998090111/AnsiballZ_copy.py'
Jan 23 10:00:35 compute-2 sudo[124894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:35.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:35 compute-2 python3.9[124896]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162434.2852414-474-106652998090111/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:35 compute-2 sudo[124894]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:36.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:36 compute-2 sudo[125048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuiyqvjzogbfqhyzzhfbarticmnyrbxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162435.9479222-519-222780313790841/AnsiballZ_stat.py'
Jan 23 10:00:36 compute-2 sudo[125048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:36 compute-2 python3.9[125050]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:36 compute-2 sudo[125048]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:36 compute-2 sudo[125173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohmmshauygqzbulmcckahtfjwxlnsqko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162435.9479222-519-222780313790841/AnsiballZ_copy.py'
Jan 23 10:00:36 compute-2 sudo[125173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:37 compute-2 python3.9[125175]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162435.9479222-519-222780313790841/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:37 compute-2 sudo[125173]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:37 compute-2 ceph-mon[75771]: pgmap v212: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:37.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:37 compute-2 sudo[125325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpbickprfojslufjnoyxarfedtgumscv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162437.420987-563-249308349665253/AnsiballZ_stat.py'
Jan 23 10:00:37 compute-2 sudo[125325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:37 compute-2 python3.9[125327]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:37 compute-2 sudo[125325]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:38.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:38 compute-2 sudo[125387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:00:38 compute-2 sudo[125387]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:38 compute-2 sudo[125387]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:38 compute-2 sudo[125477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aszauvvtsccwnkgrxkgvbtkuycwejdgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162437.420987-563-249308349665253/AnsiballZ_copy.py'
Jan 23 10:00:38 compute-2 sudo[125477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:38 compute-2 python3.9[125479]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162437.420987-563-249308349665253/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:38 compute-2 sudo[125477]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:38 compute-2 ceph-mon[75771]: pgmap v213: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:00:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:39 compute-2 sudo[125629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqfzabhxvuzejsfqekfzlacywdrmwqdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162438.7279024-609-115942111541063/AnsiballZ_stat.py'
Jan 23 10:00:39 compute-2 sudo[125629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:39 compute-2 python3.9[125631]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:39 compute-2 sudo[125629]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:39.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:39 compute-2 sudo[125754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxelqnhfuuufhjzvarlvogadfikjlkqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162438.7279024-609-115942111541063/AnsiballZ_copy.py'
Jan 23 10:00:39 compute-2 sudo[125754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:39 compute-2 python3.9[125756]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162438.7279024-609-115942111541063/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:39 compute-2 sudo[125754]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:40.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:00:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 2187 writes, 13K keys, 2187 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s
                                           Cumulative WAL: 2187 writes, 2187 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2187 writes, 13K keys, 2187 commit groups, 1.0 writes per commit group, ingest: 36.20 MB, 0.06 MB/s
                                           Interval WAL: 2187 writes, 2187 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     89.8      0.23              0.10         6    0.038       0      0       0.0       0.0
                                             L6      1/0   12.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.0     59.4     52.6      1.17              0.41         5    0.233     22K   2300       0.0       0.0
                                            Sum      1/0   12.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0     49.7     58.7      1.40              0.51        11    0.127     22K   2300       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0     49.7     58.8      1.39              0.51        10    0.139     22K   2300       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0     59.4     52.6      1.17              0.41         5    0.233     22K   2300       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     90.7      0.23              0.10         5    0.045       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.020, interval 0.020
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.4 seconds
                                           Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.4 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 1.49 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.00017 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(87,1.28 MB,0.421373%) FilterBlock(11,73.42 KB,0.0235859%) IndexBlock(11,142.14 KB,0.0456609%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:00:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:40 compute-2 sudo[125908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amyifhuphgqulblrqzoxszupblukddaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162440.1728232-653-269724643841842/AnsiballZ_file.py'
Jan 23 10:00:40 compute-2 sudo[125908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:40 compute-2 python3.9[125910]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:40 compute-2 sudo[125908]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:41 compute-2 sudo[126060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptlciupvecxrymgapovfnapaxsegbioi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162440.9151528-677-54399784140503/AnsiballZ_command.py'
Jan 23 10:00:41 compute-2 sudo[126060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:41 compute-2 ceph-mon[75771]: pgmap v214: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:41 compute-2 python3.9[126062]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:41 compute-2 sudo[126060]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:41.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:42.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:42 compute-2 sudo[126216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxwkkjupagyfljxcsrbfaigymiosyjhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162441.688684-701-138860063581898/AnsiballZ_blockinfile.py'
Jan 23 10:00:42 compute-2 sudo[126216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:42 compute-2 python3.9[126218]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:42 compute-2 sudo[126216]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:42 compute-2 sudo[126369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqppfpquwmlwarkucatyevqckoxearju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162442.6034503-729-117030827505750/AnsiballZ_command.py'
Jan 23 10:00:42 compute-2 sudo[126369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:43 compute-2 python3.9[126371]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:43 compute-2 sudo[126369]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:43 compute-2 ceph-mon[75771]: pgmap v215: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:43 compute-2 sudo[126522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdylcszgnwmrphpvkyvkfqrejtmzclsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162443.3263628-753-211411624750666/AnsiballZ_stat.py'
Jan 23 10:00:43 compute-2 sudo[126522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:43.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:43 compute-2 python3.9[126524]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:00:43 compute-2 sudo[126522]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:44.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:44 compute-2 sudo[126678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnvjwutdyjajuvyxecgybgfgpmhteytk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162444.2419515-776-224893856691936/AnsiballZ_command.py'
Jan 23 10:00:44 compute-2 sudo[126678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:44 compute-2 python3.9[126680]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:44 compute-2 sudo[126678]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:45 compute-2 ceph-mon[75771]: pgmap v216: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:45 compute-2 sudo[126833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzyjdzuxszytetcqssgrsrltjludlkpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162444.96232-800-123498943198469/AnsiballZ_file.py'
Jan 23 10:00:45 compute-2 sudo[126833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:45 compute-2 python3.9[126835]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:45 compute-2 sudo[126833]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:45.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:46.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:46 compute-2 python3.9[126987]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:00:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:47 compute-2 ceph-mon[75771]: pgmap v217: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:47 compute-2 ceph-osd[81231]: bluestore.MempoolThread fragmentation_score=0.000021 took=0.000131s
Jan 23 10:00:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:47.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:47 compute-2 sudo[127138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atllocypgqmbjgtbxktfosfpiatavscu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162447.5212452-920-50041368089030/AnsiballZ_command.py'
Jan 23 10:00:47 compute-2 sudo[127138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:48 compute-2 python3.9[127140]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:48 compute-2 ovs-vsctl[127142]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 23 10:00:48 compute-2 sudo[127138]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:48.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:48 compute-2 sudo[127293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emsqgdexmszbbqvepatxjurlmdczbfuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162448.5640059-947-83640119955353/AnsiballZ_command.py'
Jan 23 10:00:48 compute-2 sudo[127293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:49 compute-2 python3.9[127295]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:49 compute-2 sudo[127293]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:49 compute-2 sudo[127448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzouqmqrzzyyefwaglzzsbswkrapmpxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162449.2828398-972-89825015214462/AnsiballZ_command.py'
Jan 23 10:00:49 compute-2 sudo[127448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:49.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:49 compute-2 sudo[127451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:00:49 compute-2 sudo[127451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:49 compute-2 sudo[127451]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:49 compute-2 sudo[127477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 10:00:49 compute-2 sudo[127477]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:50.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:50 compute-2 sudo[127477]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:51.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:52.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:00:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:53.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:00:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:54.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:55 compute-2 python3.9[127450]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:55 compute-2 ovs-vsctl[127528]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 23 10:00:55 compute-2 sudo[127448]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:55 compute-2 ceph-mon[75771]: pgmap v218: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:00:55 compute-2 sudo[127618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:00:55 compute-2 sudo[127618]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:55 compute-2 sudo[127618]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:55.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:55 compute-2 sudo[127670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:00:55 compute-2 sudo[127670]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:55 compute-2 python3.9[127728]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:00:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:56.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:56 compute-2 sudo[127670]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:00:56 compute-2 ceph-mon[75771]: pgmap v219: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:56 compute-2 ceph-mon[75771]: pgmap v220: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:56 compute-2 ceph-mon[75771]: pgmap v221: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:00:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:00:56 compute-2 sudo[127913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhzdncuikziryczduycbquoxustodhbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162456.2728667-1023-196544994100293/AnsiballZ_file.py'
Jan 23 10:00:56 compute-2 sudo[127913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:56 compute-2 python3.9[127915]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:56 compute-2 sudo[127913]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:57 compute-2 sudo[128065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgivoajqmjellzanfaobezsxbahaqazs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162457.074266-1047-6868312240099/AnsiballZ_stat.py'
Jan 23 10:00:57 compute-2 sudo[128065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:57 compute-2 ceph-mon[75771]: pgmap v222: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:57 compute-2 python3.9[128067]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:57 compute-2 sudo[128065]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:57.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:57 compute-2 sudo[128143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpfdjmrnhhrziskkimyttsfdyriwmmgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162457.074266-1047-6868312240099/AnsiballZ_file.py'
Jan 23 10:00:57 compute-2 sudo[128143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:57 compute-2 python3.9[128145]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:58 compute-2 sudo[128143]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:58.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:58 compute-2 sudo[128223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:00:58 compute-2 sudo[128223]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:58 compute-2 sudo[128223]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:58 compute-2 sudo[128322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofaelklqnxqzfzcqadoqykhkjginvmxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162458.151526-1047-274297126959165/AnsiballZ_stat.py'
Jan 23 10:00:58 compute-2 sudo[128322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:58 compute-2 ceph-mon[75771]: pgmap v223: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:00:58 compute-2 python3.9[128324]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:58 compute-2 sudo[128322]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:58 compute-2 sudo[128400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vglfqqtcdstcdyfavmssscqefxhcziqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162458.151526-1047-274297126959165/AnsiballZ_file.py'
Jan 23 10:00:58 compute-2 sudo[128400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:59 compute-2 python3.9[128402]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:59 compute-2 sudo[128400]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:59 compute-2 sudo[128553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lajvbmxmaylevyyqeqvgyhpdyvqjxlli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162459.4186223-1116-9665355086309/AnsiballZ_file.py'
Jan 23 10:00:59 compute-2 sudo[128553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:00:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:00:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:59.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:00:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:00:59 compute-2 python3.9[128555]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:59 compute-2 sudo[128553]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df80014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:01:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:00.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:01:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:00 compute-2 sudo[128707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxmdlmvfkoqcmwqpdqulqxpusxfnnrcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162460.0957692-1140-159076715520950/AnsiballZ_stat.py'
Jan 23 10:01:00 compute-2 sudo[128707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:00 compute-2 python3.9[128709]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:00 compute-2 sudo[128707]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:00 compute-2 sudo[128785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpufododdmrvrrgrcsvfffwpndwmmnvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162460.0957692-1140-159076715520950/AnsiballZ_file.py'
Jan 23 10:01:00 compute-2 sudo[128785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:01 compute-2 CROND[128789]: (root) CMD (run-parts /etc/cron.hourly)
Jan 23 10:01:01 compute-2 python3.9[128787]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:01 compute-2 run-parts[128792]: (/etc/cron.hourly) starting 0anacron
Jan 23 10:01:01 compute-2 ceph-mon[75771]: pgmap v224: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:01 compute-2 run-parts[128798]: (/etc/cron.hourly) finished 0anacron
Jan 23 10:01:01 compute-2 CROND[128788]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 23 10:01:01 compute-2 sudo[128785]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:01:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:01.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:01:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:01 compute-2 sudo[128949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkmsktmxwrfbsdjrwwvbshjvkudmjbzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162461.5686617-1176-90190461209603/AnsiballZ_stat.py'
Jan 23 10:01:01 compute-2 sudo[128949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:02 compute-2 python3.9[128951]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:02 compute-2 sudo[128949]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:02.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:02 compute-2 sudo[129029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwcenzgvpzjdwrnfacnadulobiuqbquq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162461.5686617-1176-90190461209603/AnsiballZ_file.py'
Jan 23 10:01:02 compute-2 sudo[129029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:02 compute-2 python3.9[129031]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:02 compute-2 sudo[129029]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:03 compute-2 sudo[129181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwmimdegfisldjtkwtppwkjftrmbxody ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162462.8168151-1212-102485978231342/AnsiballZ_systemd.py'
Jan 23 10:01:03 compute-2 sudo[129181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df80014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:03 compute-2 python3.9[129183]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:01:03 compute-2 systemd[1]: Reloading.
Jan 23 10:01:03 compute-2 systemd-rc-local-generator[129210]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:03 compute-2 systemd-sysv-generator[129213]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:03.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:04 compute-2 ceph-mon[75771]: pgmap v225: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:04.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:04 compute-2 sudo[129181]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:04 compute-2 sudo[129394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvostldncwvznblcekjjbgqpmnqfcknc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162464.4725933-1236-280649523518801/AnsiballZ_stat.py'
Jan 23 10:01:04 compute-2 sudo[129353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:01:04 compute-2 sudo[129394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:04 compute-2 sudo[129353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:01:04 compute-2 sudo[129353]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:04 compute-2 python3.9[129399]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:05 compute-2 sudo[129394]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8002260 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:05 compute-2 ceph-mon[75771]: pgmap v226: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:01:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:01:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:01:05 compute-2 sudo[129476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvgwdsqgzltbgdndhkueweznoyjxfmqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162464.4725933-1236-280649523518801/AnsiballZ_file.py'
Jan 23 10:01:05 compute-2 sudo[129476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:01:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:05.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:01:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:05 compute-2 python3.9[129478]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:05 compute-2 sudo[129476]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:06.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:06 compute-2 sudo[129630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axlauejgqtxqyyfsvzwkybtbgwurvzsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162466.1698492-1271-44002365106826/AnsiballZ_stat.py'
Jan 23 10:01:06 compute-2 sudo[129630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:06 compute-2 python3.9[129632]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:06 compute-2 ceph-mon[75771]: pgmap v227: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:06 compute-2 sudo[129630]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:06 compute-2 sudo[129708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlinwgdpkgbxcwmmrjydepxwkjpodctt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162466.1698492-1271-44002365106826/AnsiballZ_file.py'
Jan 23 10:01:06 compute-2 sudo[129708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:07 compute-2 python3.9[129710]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:07 compute-2 sudo[129708]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:07 compute-2 sudo[129860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykcvvzwwiqcortokqgotfauehsmormlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162467.355675-1308-190747138046546/AnsiballZ_systemd.py'
Jan 23 10:01:07 compute-2 sudo[129860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:07.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:07 compute-2 python3.9[129862]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:01:07 compute-2 systemd[1]: Reloading.
Jan 23 10:01:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:08 compute-2 systemd-rc-local-generator[129889]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:08 compute-2 systemd-sysv-generator[129894]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:08.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:08 compute-2 systemd[1]: Starting Create netns directory...
Jan 23 10:01:08 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 10:01:08 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 10:01:08 compute-2 systemd[1]: Finished Create netns directory.
Jan 23 10:01:08 compute-2 sudo[129860]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:09 compute-2 sudo[130055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utmfvubiztxyajvrsaibbjetndjtrnbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162468.8240857-1338-154412361029196/AnsiballZ_file.py'
Jan 23 10:01:09 compute-2 sudo[130055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:09 compute-2 python3.9[130057]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:09 compute-2 sudo[130055]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:09.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:09 compute-2 ceph-mon[75771]: pgmap v228: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:01:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:10 compute-2 sudo[130208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtsjnxbmrsufiqgjttozcfosqmucfmoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162469.924888-1362-159345511888092/AnsiballZ_stat.py'
Jan 23 10:01:10 compute-2 sudo[130208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:10.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:10 compute-2 python3.9[130210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:10 compute-2 sudo[130208]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:10 compute-2 ceph-mon[75771]: pgmap v229: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:10 compute-2 sudo[130332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqwblwigpmweqcpjmfbhcqhrlosnpely ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162469.924888-1362-159345511888092/AnsiballZ_copy.py'
Jan 23 10:01:10 compute-2 sudo[130332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:11 compute-2 python3.9[130334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162469.924888-1362-159345511888092/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:11 compute-2 sudo[130332]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8002ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8002ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:11.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:11 compute-2 sudo[130485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qggcvdzdjbpgiwjtiytvdrfsreltlwmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162471.6532698-1413-181865177987378/AnsiballZ_file.py'
Jan 23 10:01:11 compute-2 sudo[130485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:12 compute-2 python3.9[130487]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:12 compute-2 sudo[130485]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:12.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:12 compute-2 sudo[130638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coxugrsqggyffyhzerapbphxqbbzcxkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162472.3660915-1437-202626401144164/AnsiballZ_file.py'
Jan 23 10:01:12 compute-2 sudo[130638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:12 compute-2 python3.9[130640]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:12 compute-2 sudo[130638]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:13 compute-2 sudo[130790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bokzisdubeltcvukowtvvmxfcsnjyhxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162473.1873362-1460-259907696972181/AnsiballZ_stat.py'
Jan 23 10:01:13 compute-2 sudo[130790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:13 compute-2 ceph-mon[75771]: pgmap v230: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:13 compute-2 python3.9[130792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:13.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:13 compute-2 sudo[130790]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:14 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8002ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:14 compute-2 sudo[130914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxpbnkobasvscrecdmoberugiekvendr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162473.1873362-1460-259907696972181/AnsiballZ_copy.py'
Jan 23 10:01:14 compute-2 sudo[130914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:14.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:14 compute-2 python3.9[130916]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162473.1873362-1460-259907696972181/.source.json _original_basename=.yzj27p_h follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:14 compute-2 sudo[130914]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:14 compute-2 ceph-mon[75771]: pgmap v231: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:14 compute-2 python3.9[131067]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:15.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:16 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:16.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:17 compute-2 ceph-mon[75771]: pgmap v232: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:17 compute-2 sudo[131490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mapnpuuyoprvqvjeztenbxyhisysiuiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162476.899134-1581-169362818086697/AnsiballZ_container_config_data.py'
Jan 23 10:01:17 compute-2 sudo[131490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:17 compute-2 python3.9[131492]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 23 10:01:17 compute-2 sudo[131490]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:17.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:18 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:18.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:18 compute-2 sudo[131594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:01:18 compute-2 sudo[131594]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:01:18 compute-2 sudo[131594]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:18 compute-2 sudo[131669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjhkusqhgaqnqyjpwmxhrqzrrhwkziks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162477.9910097-1613-111504540504508/AnsiballZ_container_config_hash.py'
Jan 23 10:01:18 compute-2 sudo[131669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:18 compute-2 python3.9[131671]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 10:01:18 compute-2 sudo[131669]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:19.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:19 compute-2 sudo[131821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iulgwigjefyvnyhznxvcsctvaczkotco ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162479.2898195-1643-275253707962189/AnsiballZ_edpm_container_manage.py'
Jan 23 10:01:19 compute-2 sudo[131821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:19 compute-2 ceph-mon[75771]: pgmap v233: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:01:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:20 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:20 compute-2 python3[131823]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 10:01:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:20.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:01:21 compute-2 ceph-mon[75771]: pgmap v234: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:21.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:22 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:22.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:22 compute-2 ceph-mon[75771]: pgmap v235: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:23.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:24 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:24.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:24 compute-2 ceph-mon[75771]: pgmap v236: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100124 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:01:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:25.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:25 compute-2 podman[131835]: 2026-01-23 10:01:25.920347942 +0000 UTC m=+5.695967912 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 10:01:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:26 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:26 compute-2 podman[131961]: 2026-01-23 10:01:26.072366932 +0000 UTC m=+0.051834974 container create 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:01:26 compute-2 podman[131961]: 2026-01-23 10:01:26.046566334 +0000 UTC m=+0.026034396 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 10:01:26 compute-2 python3[131823]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 10:01:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:26 compute-2 sudo[131821]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:26.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:27 compute-2 ceph-mon[75771]: pgmap v237: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:27.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:28 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:28.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:29 compute-2 sudo[132153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sujxwfrtdruaplpeflokvjpjsdgnegcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162489.1126678-1668-13139656406505/AnsiballZ_stat.py'
Jan 23 10:01:29 compute-2 sudo[132153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:29 compute-2 python3.9[132155]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:01:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:29 compute-2 ceph-mon[75771]: pgmap v238: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:01:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:29.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:29 compute-2 sudo[132153]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:30 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:30.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:30 compute-2 sudo[132308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aurubswqmalgzlilvfnvrhmxrgscjsaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.0119736-1694-249667426248506/AnsiballZ_file.py'
Jan 23 10:01:30 compute-2 sudo[132308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:30 compute-2 python3.9[132310]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:30 compute-2 sudo[132308]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:30 compute-2 sudo[132385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sldqdzjfwigncjqexzcnalnjsvlezkuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.0119736-1694-249667426248506/AnsiballZ_stat.py'
Jan 23 10:01:30 compute-2 sudo[132385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:30 compute-2 python3.9[132387]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:01:30 compute-2 sudo[132385]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:30 compute-2 ceph-mon[75771]: pgmap v239: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:01:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:31 compute-2 sudo[132537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsvxiefaxethahjehdrqpwcrfgsiwzjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.9878035-1694-208021842378569/AnsiballZ_copy.py'
Jan 23 10:01:31 compute-2 sudo[132537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:31 compute-2 python3.9[132539]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769162490.9878035-1694-208021842378569/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:31 compute-2 sudo[132537]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:31.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:31 compute-2 sudo[132613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egpdlzwofzhyjpqeenlxobqjvnvylehz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.9878035-1694-208021842378569/AnsiballZ_systemd.py'
Jan 23 10:01:31 compute-2 sudo[132613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:32 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:32 compute-2 python3.9[132615]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:01:32 compute-2 systemd[1]: Reloading.
Jan 23 10:01:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:32.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:32 compute-2 systemd-rc-local-generator[132641]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:32 compute-2 systemd-sysv-generator[132646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:32 compute-2 sudo[132613]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:33 compute-2 sudo[132727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pypzbbyrvlhicwdkpkbzvhjibbtfdzcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.9878035-1694-208021842378569/AnsiballZ_systemd.py'
Jan 23 10:01:33 compute-2 sudo[132727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:33.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:34 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:34.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:34 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:01:34 compute-2 python3.9[132729]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:01:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:34 compute-2 systemd[1]: Reloading.
Jan 23 10:01:34 compute-2 systemd-rc-local-generator[132761]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:34 compute-2 systemd-sysv-generator[132765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:34 compute-2 ceph-mon[75771]: pgmap v240: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:01:35 compute-2 systemd[1]: Starting ovn_controller container...
Jan 23 10:01:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:35 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:01:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4214ae656bf086d4cf887d9c946eede27210a1dc28094e67a6a5cb3b8ef610d9/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 23 10:01:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:35 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087.
Jan 23 10:01:35 compute-2 podman[132773]: 2026-01-23 10:01:35.389581131 +0000 UTC m=+0.312058878 container init 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:01:35 compute-2 ovn_controller[132789]: + sudo -E kolla_set_configs
Jan 23 10:01:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003d10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:35 compute-2 podman[132773]: 2026-01-23 10:01:35.419463444 +0000 UTC m=+0.341941171 container start 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 10:01:35 compute-2 edpm-start-podman-container[132773]: ovn_controller
Jan 23 10:01:35 compute-2 systemd[1]: Created slice User Slice of UID 0.
Jan 23 10:01:35 compute-2 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 23 10:01:35 compute-2 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 23 10:01:35 compute-2 systemd[1]: Starting User Manager for UID 0...
Jan 23 10:01:35 compute-2 podman[132795]: 2026-01-23 10:01:35.503586872 +0000 UTC m=+0.071516537 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 10:01:35 compute-2 systemd[132824]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 23 10:01:35 compute-2 systemd[1]: 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087-30cbb83d56284482.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 10:01:35 compute-2 systemd[1]: 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087-30cbb83d56284482.service: Failed with result 'exit-code'.
Jan 23 10:01:35 compute-2 edpm-start-podman-container[132772]: Creating additional drop-in dependency for "ovn_controller" (7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087)
Jan 23 10:01:35 compute-2 systemd[1]: Reloading.
Jan 23 10:01:35 compute-2 systemd-rc-local-generator[132870]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:35 compute-2 systemd-sysv-generator[132876]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:35 compute-2 systemd[132824]: Queued start job for default target Main User Target.
Jan 23 10:01:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:35 compute-2 systemd[132824]: Created slice User Application Slice.
Jan 23 10:01:35 compute-2 systemd[132824]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 23 10:01:35 compute-2 systemd[132824]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 10:01:35 compute-2 systemd[132824]: Reached target Paths.
Jan 23 10:01:35 compute-2 systemd[132824]: Reached target Timers.
Jan 23 10:01:35 compute-2 systemd[132824]: Starting D-Bus User Message Bus Socket...
Jan 23 10:01:35 compute-2 systemd[132824]: Starting Create User's Volatile Files and Directories...
Jan 23 10:01:35 compute-2 systemd[132824]: Finished Create User's Volatile Files and Directories.
Jan 23 10:01:35 compute-2 systemd[132824]: Listening on D-Bus User Message Bus Socket.
Jan 23 10:01:35 compute-2 systemd[132824]: Reached target Sockets.
Jan 23 10:01:35 compute-2 systemd[132824]: Reached target Basic System.
Jan 23 10:01:35 compute-2 systemd[132824]: Reached target Main User Target.
Jan 23 10:01:35 compute-2 systemd[132824]: Startup finished in 234ms.
Jan 23 10:01:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:01:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:35.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:01:35 compute-2 systemd[1]: Started User Manager for UID 0.
Jan 23 10:01:35 compute-2 systemd[1]: Started ovn_controller container.
Jan 23 10:01:35 compute-2 systemd[1]: Started Session c1 of User root.
Jan 23 10:01:35 compute-2 sudo[132727]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:36 compute-2 ovn_controller[132789]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 10:01:36 compute-2 ovn_controller[132789]: INFO:__main__:Validating config file
Jan 23 10:01:36 compute-2 ovn_controller[132789]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 10:01:36 compute-2 ovn_controller[132789]: INFO:__main__:Writing out command to execute
Jan 23 10:01:36 compute-2 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 23 10:01:36 compute-2 ovn_controller[132789]: ++ cat /run_command
Jan 23 10:01:36 compute-2 ovn_controller[132789]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 10:01:36 compute-2 ovn_controller[132789]: + ARGS=
Jan 23 10:01:36 compute-2 ovn_controller[132789]: + sudo kolla_copy_cacerts
Jan 23 10:01:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:36 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:36 compute-2 systemd[1]: Started Session c2 of User root.
Jan 23 10:01:36 compute-2 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 23 10:01:36 compute-2 ovn_controller[132789]: + [[ ! -n '' ]]
Jan 23 10:01:36 compute-2 ovn_controller[132789]: + . kolla_extend_start
Jan 23 10:01:36 compute-2 ovn_controller[132789]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 10:01:36 compute-2 ovn_controller[132789]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 23 10:01:36 compute-2 ovn_controller[132789]: + umask 0022
Jan 23 10:01:36 compute-2 ovn_controller[132789]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.0907] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.0916] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <warn>  [1769162496.0918] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.0927] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.0933] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.0938] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 10:01:36 compute-2 kernel: br-int: entered promiscuous mode
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 23 10:01:36 compute-2 systemd-udevd[132921]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:01:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 10:01:36 compute-2 ovn_controller[132789]: 2026-01-23T10:01:36Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.1847] manager: (ovn-eb059b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.1857] manager: (ovn-170ec8-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.1864] manager: (ovn-57e418-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 23 10:01:36 compute-2 kernel: genev_sys_6081: entered promiscuous mode
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.2011] device (genev_sys_6081): carrier: link connected
Jan 23 10:01:36 compute-2 NetworkManager[48964]: <info>  [1769162496.2014] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Jan 23 10:01:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:36.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:36 compute-2 ceph-mon[75771]: pgmap v241: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:01:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:01:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:37 compute-2 python3.9[133052]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 10:01:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:37.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:38 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003d30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:38 compute-2 ceph-mon[75771]: pgmap v242: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 170 B/s wr, 0 op/s
Jan 23 10:01:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:38.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:38 compute-2 sudo[133204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywrgmknbrqagqqpwgnbslrkasrvjjnjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162497.7788699-1830-160890388242940/AnsiballZ_stat.py'
Jan 23 10:01:38 compute-2 sudo[133204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:38 compute-2 sudo[133207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:01:38 compute-2 sudo[133207]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:01:38 compute-2 sudo[133207]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:38 compute-2 python3.9[133206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:38 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:01:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:38 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:01:38 compute-2 sudo[133204]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:38 compute-2 sudo[133352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgetyynybzyrojxzjqanqxkydnacmidi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162497.7788699-1830-160890388242940/AnsiballZ_copy.py'
Jan 23 10:01:38 compute-2 sudo[133352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:39 compute-2 python3.9[133354]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162497.7788699-1830-160890388242940/.source.yaml _original_basename=.p8utvsxh follow=False checksum=a80724acad465d51ee59522dfe4a3a5c05876d7d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:39 compute-2 sudo[133352]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:39 compute-2 sudo[133504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjqrydehuivhboovzbyeifkwpdwriluf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162499.409657-1874-174565540185149/AnsiballZ_command.py'
Jan 23 10:01:39 compute-2 sudo[133504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:39.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:39 compute-2 python3.9[133506]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:01:39 compute-2 ovs-vsctl[133507]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 23 10:01:39 compute-2 sudo[133504]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:40 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:40.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:40 compute-2 ceph-mon[75771]: pgmap v243: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:01:40 compute-2 sudo[133659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyluykcwwzivzdsjkxlepuexgzitzqbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162500.1255572-1899-7932926957564/AnsiballZ_command.py'
Jan 23 10:01:40 compute-2 sudo[133659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:40 compute-2 python3.9[133661]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:01:40 compute-2 ovs-vsctl[133663]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 23 10:01:40 compute-2 sudo[133659]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:41 compute-2 sudo[133814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhpdujvlosvkkwruaombejsbclodaebd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162501.1905372-1940-227252708411884/AnsiballZ_command.py'
Jan 23 10:01:41 compute-2 sudo[133814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:41 compute-2 ceph-mon[75771]: pgmap v244: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:01:41 compute-2 python3.9[133816]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:01:41 compute-2 ovs-vsctl[133817]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 23 10:01:41 compute-2 sudo[133814]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:01:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:41.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:01:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:42 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:42 compute-2 sshd-session[121837]: Connection closed by 192.168.122.30 port 45078
Jan 23 10:01:42 compute-2 sshd-session[121834]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:01:42 compute-2 systemd[1]: session-49.scope: Deactivated successfully.
Jan 23 10:01:42 compute-2 systemd[1]: session-49.scope: Consumed 59.517s CPU time.
Jan 23 10:01:42 compute-2 systemd-logind[786]: Session 49 logged out. Waiting for processes to exit.
Jan 23 10:01:42 compute-2 systemd-logind[786]: Removed session 49.
Jan 23 10:01:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:42.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:42 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:01:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:43 compute-2 ceph-mon[75771]: pgmap v245: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:01:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:01:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:43.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:01:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:44 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:44.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:44 compute-2 ceph-mon[75771]: pgmap v246: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:01:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:45.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:46 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:46.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:46 compute-2 systemd[1]: Stopping User Manager for UID 0...
Jan 23 10:01:46 compute-2 systemd[132824]: Activating special unit Exit the Session...
Jan 23 10:01:46 compute-2 systemd[132824]: Stopped target Main User Target.
Jan 23 10:01:46 compute-2 systemd[132824]: Stopped target Basic System.
Jan 23 10:01:46 compute-2 systemd[132824]: Stopped target Paths.
Jan 23 10:01:46 compute-2 systemd[132824]: Stopped target Sockets.
Jan 23 10:01:46 compute-2 systemd[132824]: Stopped target Timers.
Jan 23 10:01:46 compute-2 systemd[132824]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 10:01:46 compute-2 systemd[132824]: Closed D-Bus User Message Bus Socket.
Jan 23 10:01:46 compute-2 systemd[132824]: Stopped Create User's Volatile Files and Directories.
Jan 23 10:01:46 compute-2 systemd[132824]: Removed slice User Application Slice.
Jan 23 10:01:46 compute-2 systemd[132824]: Reached target Shutdown.
Jan 23 10:01:46 compute-2 systemd[132824]: Finished Exit the Session.
Jan 23 10:01:46 compute-2 systemd[132824]: Reached target Exit the Session.
Jan 23 10:01:46 compute-2 systemd[1]: user@0.service: Deactivated successfully.
Jan 23 10:01:46 compute-2 systemd[1]: Stopped User Manager for UID 0.
Jan 23 10:01:46 compute-2 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 23 10:01:46 compute-2 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 23 10:01:46 compute-2 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 23 10:01:46 compute-2 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 23 10:01:46 compute-2 systemd[1]: Removed slice User Slice of UID 0.
Jan 23 10:01:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100146 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:01:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:47 compute-2 ceph-mon[75771]: pgmap v247: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:01:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:47.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:48 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:48 compute-2 sshd-session[133852]: Accepted publickey for zuul from 192.168.122.30 port 43630 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:01:48 compute-2 systemd-logind[786]: New session 51 of user zuul.
Jan 23 10:01:48 compute-2 systemd[1]: Started Session 51 of User zuul.
Jan 23 10:01:48 compute-2 sshd-session[133852]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:01:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:01:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:48.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:01:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:49 compute-2 ceph-mon[75771]: pgmap v248: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 852 B/s wr, 3 op/s
Jan 23 10:01:49 compute-2 python3.9[134006]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:01:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:49.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:50 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:50.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:50 compute-2 sudo[134162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrpzgmvinearqcmkowcnfbemuvpyegib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162509.8766708-59-77844213739198/AnsiballZ_file.py'
Jan 23 10:01:50 compute-2 sudo[134162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:01:50 compute-2 python3.9[134164]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:50 compute-2 sudo[134162]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:50 compute-2 sudo[134314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcazmmynrknozxrxawgvdldrpjtiwwrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162510.692869-59-19150850683936/AnsiballZ_file.py'
Jan 23 10:01:50 compute-2 sudo[134314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:51 compute-2 python3.9[134316]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:51 compute-2 sudo[134314]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:51 compute-2 ceph-mon[75771]: pgmap v249: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 10:01:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:51.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:51 compute-2 sudo[134466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyvfmemoievilyhqmchxgojqnzjvdwjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162511.4230855-59-73004694743831/AnsiballZ_file.py'
Jan 23 10:01:51 compute-2 sudo[134466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:52 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:52 compute-2 python3.9[134469]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:52 compute-2 sudo[134466]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:52.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:52 compute-2 sudo[134620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znoxnfnoeolvtblyfhcbpsapjicdbirg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162512.267371-59-21785378551869/AnsiballZ_file.py'
Jan 23 10:01:52 compute-2 sudo[134620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:52 compute-2 python3.9[134622]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:52 compute-2 sudo[134620]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:53 compute-2 sudo[134772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzaoevvclwhdbwjxrqklpuaflfiyoqvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162512.9033818-59-90053048257476/AnsiballZ_file.py'
Jan 23 10:01:53 compute-2 sudo[134772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:53 compute-2 python3.9[134774]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:53 compute-2 sudo[134772]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:01:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:53.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:01:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:54 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:54.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:54 compute-2 ceph-mon[75771]: pgmap v250: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 10:01:54 compute-2 python3.9[134926]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:01:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:55 compute-2 sudo[135077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmjpegjxwugmdrfcxvcclvhxinlifbop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162514.7656302-191-190671304838948/AnsiballZ_seboolean.py'
Jan 23 10:01:55 compute-2 sudo[135077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:55 compute-2 python3.9[135079]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 10:01:55 compute-2 kernel: ganesha.nfsd[121050]: segfault at 50 ip 00007f6e9b73932e sp 00007f6e2dffa210 error 4 in libntirpc.so.5.8[7f6e9b71e000+2c000] likely on CPU 6 (core 0, socket 6)
Jan 23 10:01:55 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:01:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy ignored for local
Jan 23 10:01:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:55 compute-2 systemd[1]: Started Process Core Dump (PID 135080/UID 0).
Jan 23 10:01:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:55.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:56 compute-2 sudo[135077]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:56 compute-2 python3.9[135233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:57 compute-2 systemd-coredump[135081]: Process 119239 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 43:
                                                    #0  0x00007f6e9b73932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:01:57 compute-2 systemd[1]: systemd-coredump@4-135080-0.service: Deactivated successfully.
Jan 23 10:01:57 compute-2 systemd[1]: systemd-coredump@4-135080-0.service: Consumed 1.841s CPU time.
Jan 23 10:01:57 compute-2 podman[135332]: 2026-01-23 10:01:57.434834522 +0000 UTC m=+0.034190508 container died 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:01:57 compute-2 systemd[1]: var-lib-containers-storage-overlay-61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63-merged.mount: Deactivated successfully.
Jan 23 10:01:57 compute-2 podman[135332]: 2026-01-23 10:01:57.478211748 +0000 UTC m=+0.077567734 container remove 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:01:57 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:01:57 compute-2 python3.9[135368]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162516.3327856-215-139474388768614/.source follow=False _original_basename=haproxy.j2 checksum=1daf285be4abb25cbd7ba376734de140aac9aefe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:57 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:01:57 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.156s CPU time.
Jan 23 10:01:57 compute-2 ceph-mon[75771]: pgmap v251: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:01:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:57.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:58.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:58 compute-2 sudo[135427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:01:58 compute-2 sudo[135427]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:01:58 compute-2 sudo[135427]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:59 compute-2 python3.9[135577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:59 compute-2 python3.9[135698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162518.6094809-260-79548490163396/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:01:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:01:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:59.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:00.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:00 compute-2 ceph-mon[75771]: pgmap v252: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:02:00 compute-2 ceph-mon[75771]: pgmap v253: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 340 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:02:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:00 compute-2 sudo[135850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtefiedxnsjkzvkooojkaybuyweflkjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162520.4842458-311-50694257288482/AnsiballZ_setup.py'
Jan 23 10:02:00 compute-2 sudo[135850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:01 compute-2 python3.9[135852]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 10:02:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:01 compute-2 sudo[135850]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:01 compute-2 ceph-mon[75771]: pgmap v254: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100201 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:02:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:01 compute-2 sudo[135934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feomtopjgmcfanxjxpvhowszsihplooc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162520.4842458-311-50694257288482/AnsiballZ_dnf.py'
Jan 23 10:02:01 compute-2 sudo[135934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:01.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:01 compute-2 python3.9[135936]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:02:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:02:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 5367 writes, 23K keys, 5367 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5367 writes, 783 syncs, 6.85 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5367 writes, 23K keys, 5367 commit groups, 1.0 writes per commit group, ingest: 18.76 MB, 0.03 MB/s
                                           Interval WAL: 5367 writes, 783 syncs, 6.85 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 10:02:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:02.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:03 compute-2 ceph-mon[75771]: pgmap v255: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:03 compute-2 sudo[135934]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:03.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:04.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:04 compute-2 ceph-mon[75771]: pgmap v256: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:04 compute-2 sudo[136091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyskfxadcyvfwrnxwkelqwzkmcthnsjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162524.113774-347-79562700262185/AnsiballZ_systemd.py'
Jan 23 10:02:04 compute-2 sudo[136091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:04 compute-2 sudo[136094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:02:04 compute-2 sudo[136094]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:04 compute-2 sudo[136094]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:04 compute-2 sudo[136119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:02:04 compute-2 sudo[136119]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:05 compute-2 python3.9[136093]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:02:05 compute-2 sudo[136091]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:05 compute-2 sudo[136119]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:02:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:02:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:02:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:02:05 compute-2 ovn_controller[132789]: 2026-01-23T10:02:05Z|00025|memory|INFO|17280 kB peak resident set size after 29.6 seconds
Jan 23 10:02:05 compute-2 ovn_controller[132789]: 2026-01-23T10:02:05Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 23 10:02:05 compute-2 podman[136300]: 2026-01-23 10:02:05.685091942 +0000 UTC m=+0.101186394 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 10:02:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:05.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:05 compute-2 python3.9[136341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:06.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:06 compute-2 python3.9[136474]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162525.3347979-371-168894186340437/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:06 compute-2 python3.9[136625]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:07 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:02:07 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:02:07 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:02:07 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:02:07 compute-2 ceph-mon[75771]: pgmap v257: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:07 compute-2 python3.9[136746]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162526.498535-371-18692316727513/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:07.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:07 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 5.
Jan 23 10:02:07 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:02:07 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.156s CPU time.
Jan 23 10:02:07 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:02:08 compute-2 podman[136816]: 2026-01-23 10:02:08.014764017 +0000 UTC m=+0.043632463 container create 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 10:02:08 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:08 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:08 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:08 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:08 compute-2 podman[136816]: 2026-01-23 10:02:08.073788704 +0000 UTC m=+0.102657170 container init 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 10:02:08 compute-2 podman[136816]: 2026-01-23 10:02:08.080987914 +0000 UTC m=+0.109856350 container start 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:02:08 compute-2 bash[136816]: 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9
Jan 23 10:02:08 compute-2 podman[136816]: 2026-01-23 10:02:07.994541841 +0000 UTC m=+0.023410307 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:02:08 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:02:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:08.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:08 compute-2 python3.9[136999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:09 compute-2 ceph-mon[75771]: pgmap v258: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:02:09 compute-2 python3.9[137120]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162528.3329885-503-60628928061541/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:09.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:09 compute-2 python3.9[137270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:10.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:10 compute-2 python3.9[137392]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162529.51031-503-199991504748083/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:11 compute-2 python3.9[137543]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:02:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:11 compute-2 ceph-mon[75771]: pgmap v259: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:02:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:11.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:11 compute-2 sudo[137695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwpormicsvviayhbguclmdmvbexhseph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162531.5446262-618-116736102228164/AnsiballZ_file.py'
Jan 23 10:02:11 compute-2 sudo[137695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:12 compute-2 python3.9[137697]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:12 compute-2 sudo[137695]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:12.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:12 compute-2 sudo[137849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epsxuqbkhfxefavgibztoadlgxdlxxql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162532.2633414-641-259303532077658/AnsiballZ_stat.py'
Jan 23 10:02:12 compute-2 sudo[137849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:12 compute-2 python3.9[137851]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:12 compute-2 sudo[137849]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:12 compute-2 sudo[137927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywqgreuwvtuiwiwswwlwpexdsoysytgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162532.2633414-641-259303532077658/AnsiballZ_file.py'
Jan 23 10:02:12 compute-2 sudo[137927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:13 compute-2 python3.9[137929]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:13 compute-2 sudo[137927]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:13 compute-2 sudo[138006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:02:13 compute-2 sudo[138006]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:13 compute-2 sudo[138006]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:13 compute-2 ceph-mon[75771]: pgmap v260: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:02:13 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:02:13 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:02:13 compute-2 sudo[138104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eurxkqaydhvxzxicuueoelnliqmkcrpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162533.2891512-641-115392350914703/AnsiballZ_stat.py'
Jan 23 10:02:13 compute-2 sudo[138104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:13 compute-2 python3.9[138106]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:13 compute-2 sudo[138104]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:13.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:13 compute-2 sudo[138183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnitgdfapwrprhefehzikxdfnkcsiyqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162533.2891512-641-115392350914703/AnsiballZ_file.py'
Jan 23 10:02:13 compute-2 sudo[138183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:14 compute-2 python3.9[138185]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:14 compute-2 sudo[138183]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:14.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:14 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:02:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:14 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:02:14 compute-2 ceph-mon[75771]: pgmap v261: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:02:14 compute-2 sudo[138336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yijjhymjcdgbiihfndxfesulvfpgxlxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162534.4745903-711-60895983340119/AnsiballZ_file.py'
Jan 23 10:02:14 compute-2 sudo[138336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:14 compute-2 python3.9[138338]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:14 compute-2 sudo[138336]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:15 compute-2 sudo[138488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvwhkesyupwibqfrjlvpkrsjmnwepvfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162535.1866205-735-53391703297340/AnsiballZ_stat.py'
Jan 23 10:02:15 compute-2 sudo[138488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:15 compute-2 python3.9[138490]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:15 compute-2 sudo[138488]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:15.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:15 compute-2 sudo[138566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uejpttxkvftqwohyhpqbgilldxysaatj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162535.1866205-735-53391703297340/AnsiballZ_file.py'
Jan 23 10:02:15 compute-2 sudo[138566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:16 compute-2 python3.9[138568]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:16 compute-2 sudo[138566]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:16.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:16 compute-2 sudo[138720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpxtumixgothjmxzosbxvifvnrkvkuqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162536.3385963-771-6660113997424/AnsiballZ_stat.py'
Jan 23 10:02:16 compute-2 sudo[138720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:16 compute-2 python3.9[138722]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:16 compute-2 sudo[138720]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:17 compute-2 ceph-mon[75771]: pgmap v262: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 170 B/s wr, 0 op/s
Jan 23 10:02:17 compute-2 sudo[138798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctinuxazbedoyogrexrsqoiuhrchkguc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162536.3385963-771-6660113997424/AnsiballZ_file.py'
Jan 23 10:02:17 compute-2 sudo[138798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:17 compute-2 python3.9[138800]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:17 compute-2 sudo[138798]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:17 compute-2 sudo[138950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knikuhekukmjxjllhhowhqmhoxccrmcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162537.5078065-806-65248266145715/AnsiballZ_systemd.py'
Jan 23 10:02:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:17.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:17 compute-2 sudo[138950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:18 compute-2 python3.9[138952]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:02:18 compute-2 systemd[1]: Reloading.
Jan 23 10:02:18 compute-2 systemd-sysv-generator[138983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:18 compute-2 systemd-rc-local-generator[138979]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:18.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:18 compute-2 sudo[138950]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:18 compute-2 sudo[138991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:02:18 compute-2 sudo[138991]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:18 compute-2 sudo[138991]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:19 compute-2 sudo[139165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utfrxklxkpptvohgqcmmkvnaqpxqsjjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162538.7548-830-166185505344242/AnsiballZ_stat.py'
Jan 23 10:02:19 compute-2 sudo[139165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:19 compute-2 python3.9[139167]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:19 compute-2 sudo[139165]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:19 compute-2 ceph-mon[75771]: pgmap v263: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:02:19 compute-2 sudo[139243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilgdoxdcipqcbyxccyusjobforxrfqrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162538.7548-830-166185505344242/AnsiballZ_file.py'
Jan 23 10:02:19 compute-2 sudo[139243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:19 compute-2 python3.9[139245]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:19 compute-2 sudo[139243]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:19.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:20 compute-2 sudo[139396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iddydsyjcqfoqlfhucfiwncmpqofcoha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162539.9424934-866-107940607307959/AnsiballZ_stat.py'
Jan 23 10:02:20 compute-2 sudo[139396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:20.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:20 compute-2 python3.9[139398]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:20 compute-2 sudo[139396]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:02:20 compute-2 sudo[139486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujtaaillflydczewrmrofsstgwvoigcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162539.9424934-866-107940607307959/AnsiballZ_file.py'
Jan 23 10:02:20 compute-2 sudo[139486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:20 compute-2 python3.9[139488]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:20 compute-2 sudo[139486]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:21 compute-2 sudo[139638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rigakahlktfbwlyrqcdtrqfshpiemlsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162541.0960727-903-151672062425737/AnsiballZ_systemd.py'
Jan 23 10:02:21 compute-2 sudo[139638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:21 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd960000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:21 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9540016c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:21 compute-2 ceph-mon[75771]: pgmap v264: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:02:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:21 compute-2 python3.9[139640]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:02:21 compute-2 systemd[1]: Reloading.
Jan 23 10:02:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:21.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:21 compute-2 systemd-rc-local-generator[139673]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:21 compute-2 systemd-sysv-generator[139676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:22 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:22 compute-2 systemd[1]: Starting Create netns directory...
Jan 23 10:02:22 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 10:02:22 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 10:02:22 compute-2 systemd[1]: Finished Create netns directory.
Jan 23 10:02:22 compute-2 sudo[139638]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:22.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:22 compute-2 sudo[139839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwyidwuhrblmyvmhacaxojyfwqbidtyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162542.4770958-932-185769382494336/AnsiballZ_file.py'
Jan 23 10:02:22 compute-2 sudo[139839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:22 compute-2 python3.9[139841]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:22 compute-2 sudo[139839]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:23 compute-2 ceph-mon[75771]: pgmap v265: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:02:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:23 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100223 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:02:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:23 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:23 compute-2 sudo[139991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzzdbfmawvbvdbkphytygwkshglkdlhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162543.3174987-956-250022875305633/AnsiballZ_stat.py'
Jan 23 10:02:23 compute-2 sudo[139991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:23 compute-2 python3.9[139993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:23 compute-2 sudo[139991]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:23.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:24 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:24 compute-2 sudo[140115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eljmezldzjhiyyyggaecfozfabatvarv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162543.3174987-956-250022875305633/AnsiballZ_copy.py'
Jan 23 10:02:24 compute-2 sudo[140115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:24 compute-2 python3.9[140117]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162543.3174987-956-250022875305633/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:24.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:24 compute-2 sudo[140115]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:25 compute-2 sudo[140268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rutptecyrtwzqvhiqzynppnsmgfgbjva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162544.8821292-1008-234187948463566/AnsiballZ_file.py'
Jan 23 10:02:25 compute-2 sudo[140268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:25 compute-2 python3.9[140270]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:25 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:25 compute-2 sudo[140268]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:25 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:25 compute-2 ceph-mon[75771]: pgmap v266: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:02:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:25.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:25 compute-2 sudo[140420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilsrvpcmjbbvsyszrvpbboystmhgasmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162545.5974255-1031-58699722180217/AnsiballZ_file.py'
Jan 23 10:02:25 compute-2 sudo[140420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:26 compute-2 python3.9[140422]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:26 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:26 compute-2 sudo[140420]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:26.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:26 compute-2 sudo[140574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpmlocusjqxixeitcjmzdkvkjsmpirjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162546.3612838-1055-748943752289/AnsiballZ_stat.py'
Jan 23 10:02:26 compute-2 sudo[140574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:26 compute-2 python3.9[140576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:26 compute-2 sudo[140574]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:27 compute-2 sudo[140697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxcaqoffxdcevwvnkebqmspuxyskzwex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162546.3612838-1055-748943752289/AnsiballZ_copy.py'
Jan 23 10:02:27 compute-2 sudo[140697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:27 compute-2 python3.9[140699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162546.3612838-1055-748943752289/.source.json _original_basename=.eahzrann follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:27 compute-2 sudo[140697]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:27 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:27 compute-2 ceph-mon[75771]: pgmap v267: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:02:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:27 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:27.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:28 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:28 compute-2 python3.9[140850]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:28.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:29 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:29 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:29.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:29 compute-2 ceph-mon[75771]: pgmap v268: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 852 B/s wr, 2 op/s
Jan 23 10:02:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:30 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:30 compute-2 sudo[141273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwkhaxgaozuelxwmhrbvqsicgporlebv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162549.7385695-1175-84925855853595/AnsiballZ_container_config_data.py'
Jan 23 10:02:30 compute-2 sudo[141273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:30.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:30 compute-2 python3.9[141275]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 23 10:02:30 compute-2 sudo[141273]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:30 compute-2 ceph-mon[75771]: pgmap v269: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:02:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:31 compute-2 sudo[141426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leumlzsnfkaqykvvsaypxuixmobdxcjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162550.8932612-1208-212902539845799/AnsiballZ_container_config_hash.py'
Jan 23 10:02:31 compute-2 sudo[141426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:31 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:31 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:31 compute-2 python3.9[141428]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 10:02:31 compute-2 sudo[141426]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:31.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:32 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:32 compute-2 sudo[141580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esrzkkoopufjpxsetfmibihbxlurxgmb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162551.8712635-1238-88650914220914/AnsiballZ_edpm_container_manage.py'
Jan 23 10:02:32 compute-2 sudo[141580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:32.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:32 compute-2 python3[141582]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 10:02:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:33 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:33 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:33.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:34 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:34 compute-2 ceph-mon[75771]: pgmap v270: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:02:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:34.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:34 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 10:02:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:35 compute-2 ceph-mon[75771]: pgmap v271: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:02:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:35 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:35 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:35.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:36 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:36.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:37 compute-2 ceph-mon[75771]: pgmap v272: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 32 KiB/s rd, 0 B/s wr, 52 op/s
Jan 23 10:02:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:37 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:37 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9400032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:37.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:38 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:38.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:38 compute-2 sudo[141678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:02:38 compute-2 sudo[141678]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:38 compute-2 sudo[141678]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100238 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:02:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:39 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:39 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:39.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:40 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9400032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:02:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:40.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:02:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:41 compute-2 ceph-mon[75771]: pgmap v273: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 0 B/s wr, 73 op/s
Jan 23 10:02:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:41 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:41 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:41.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:42 compute-2 podman[141664]: 2026-01-23 10:02:42.043903007 +0000 UTC m=+5.715004400 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:02:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:42 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:42.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:43 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:43 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:02:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:43.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:02:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:44 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:44.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:45 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:45 compute-2 sshd-session[141740]: Invalid user  from 194.187.176.105 port 24064
Jan 23 10:02:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:45 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:45 compute-2 sshd-session[141740]: Connection closed by invalid user  194.187.176.105 port 24064 [preauth]
Jan 23 10:02:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:45.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:46 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:46.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:46 compute-2 ceph-mon[75771]: pgmap v274: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 0 B/s wr, 73 op/s
Jan 23 10:02:46 compute-2 ceph-mon[75771]: pgmap v275: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 KiB/s rd, 0 B/s wr, 141 op/s
Jan 23 10:02:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:02:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:47.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:47 compute-2 podman[141595]: 2026-01-23 10:02:47.941644504 +0000 UTC m=+15.208525819 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:02:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:48 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:48 compute-2 podman[141783]: 2026-01-23 10:02:48.096845521 +0000 UTC m=+0.053823998 container create bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:02:48 compute-2 podman[141783]: 2026-01-23 10:02:48.070493171 +0000 UTC m=+0.027471678 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:02:48 compute-2 python3[141582]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:02:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:48 compute-2 sudo[141580]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:48.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:48 compute-2 sudo[141970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkqyvclwzxxieyygaikrqhndeupbpqrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162568.412663-1262-69878549187696/AnsiballZ_stat.py'
Jan 23 10:02:48 compute-2 sudo[141970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:48 compute-2 ceph-mon[75771]: pgmap v276: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 KiB/s rd, 0 B/s wr, 141 op/s
Jan 23 10:02:48 compute-2 ceph-mon[75771]: pgmap v277: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 90 KiB/s rd, 85 B/s wr, 150 op/s
Jan 23 10:02:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:48 compute-2 python3.9[141972]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:02:48 compute-2 sudo[141970]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:49 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:49 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:49 compute-2 sudo[142124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zulkqidrzprtplznskordyxbqfenhocs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162569.2547867-1289-188472335606714/AnsiballZ_file.py'
Jan 23 10:02:49 compute-2 sudo[142124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:49 compute-2 python3.9[142126]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:49 compute-2 sudo[142124]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:49.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:49 compute-2 sudo[142201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojyqcgzihvjgchdjjzyhxozvnkepjesr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162569.2547867-1289-188472335606714/AnsiballZ_stat.py'
Jan 23 10:02:49 compute-2 sudo[142201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:50 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:50 compute-2 python3.9[142203]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:02:50 compute-2 sudo[142201]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:50 compute-2 ceph-mon[75771]: pgmap v278: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 59 KiB/s rd, 85 B/s wr, 97 op/s
Jan 23 10:02:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:50.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:50 : epoch 69734720 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:02:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:50 : epoch 69734720 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:02:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:50 compute-2 sudo[142353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyewcluzbrfxufnfaksrnfbqyegikvcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162570.2571516-1289-62413515576395/AnsiballZ_copy.py'
Jan 23 10:02:50 compute-2 sudo[142353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:51 compute-2 python3.9[142355]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769162570.2571516-1289-62413515576395/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:51 compute-2 sudo[142353]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:51 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:51 compute-2 sudo[142429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euggemjkgadngrsxdwhqcbhpjrjmlfmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162570.2571516-1289-62413515576395/AnsiballZ_systemd.py'
Jan 23 10:02:51 compute-2 sudo[142429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:51 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:51 compute-2 python3.9[142431]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:02:51 compute-2 systemd[1]: Reloading.
Jan 23 10:02:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:51 compute-2 systemd-rc-local-generator[142457]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:51 compute-2 systemd-sysv-generator[142460]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:51.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:52 compute-2 sudo[142429]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:52 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:52 compute-2 sudo[142542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyhfrorgllunangahittnyeqennayelp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162570.2571516-1289-62413515576395/AnsiballZ_systemd.py'
Jan 23 10:02:52 compute-2 sudo[142542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:52.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:52 compute-2 python3.9[142544]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:02:52 compute-2 systemd[1]: Reloading.
Jan 23 10:02:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:52 compute-2 systemd-rc-local-generator[142569]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:52 compute-2 systemd-sysv-generator[142576]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:53 compute-2 systemd[1]: Starting ovn_metadata_agent container...
Jan 23 10:02:53 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:02:53 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5293bc396b427585d7816b1a4e1106196f5aa403872ea13b3594ce59e34d84cf/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:53 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5293bc396b427585d7816b1a4e1106196f5aa403872ea13b3594ce59e34d84cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:53 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3.
Jan 23 10:02:53 compute-2 podman[142586]: 2026-01-23 10:02:53.148013865 +0000 UTC m=+0.122438221 container init bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: + sudo -E kolla_set_configs
Jan 23 10:02:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:53 compute-2 podman[142586]: 2026-01-23 10:02:53.177632345 +0000 UTC m=+0.152056691 container start bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 10:02:53 compute-2 edpm-start-podman-container[142586]: ovn_metadata_agent
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Validating config file
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Copying service configuration files
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Writing out command to execute
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 23 10:02:53 compute-2 edpm-start-podman-container[142585]: Creating additional drop-in dependency for "ovn_metadata_agent" (bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3)
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: ++ cat /run_command
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: + CMD=neutron-ovn-metadata-agent
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: + ARGS=
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: + sudo kolla_copy_cacerts
Jan 23 10:02:53 compute-2 systemd[1]: Reloading.
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: + [[ ! -n '' ]]
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: + . kolla_extend_start
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: Running command: 'neutron-ovn-metadata-agent'
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: + umask 0022
Jan 23 10:02:53 compute-2 ovn_metadata_agent[142601]: + exec neutron-ovn-metadata-agent
Jan 23 10:02:53 compute-2 podman[142607]: 2026-01-23 10:02:53.276660017 +0000 UTC m=+0.086109255 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:02:53 compute-2 systemd-rc-local-generator[142678]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:53 compute-2 systemd-sysv-generator[142681]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:53 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:53 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:53 compute-2 systemd[1]: Started ovn_metadata_agent container.
Jan 23 10:02:53 compute-2 sudo[142542]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:53.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:53 : epoch 69734720 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:02:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:54 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:54.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:55 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.425 142606 INFO neutron.common.config [-] Logging enabled!
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.425 142606 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.425 142606 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.426 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.426 142606 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.426 142606 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 10:02:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:55 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.472 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.472 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.472 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.472 142606 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.473 142606 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.489 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 8fb585ea-168c-48ac-870f-617a4fa1bbde (UUID: 8fb585ea-168c-48ac-870f-617a4fa1bbde) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.520 142606 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.521 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.521 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.521 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.525 142606 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.531 142606 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.538 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '8fb585ea-168c-48ac-870f-617a4fa1bbde'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], external_ids={}, name=8fb585ea-168c-48ac-870f-617a4fa1bbde, nb_cfg_timestamp=1769162504105, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.539 142606 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fefcfae0f40>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.540 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.541 142606 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.541 142606 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.541 142606 INFO oslo_service.service [-] Starting 1 workers
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.546 142606 DEBUG oslo_service.service [-] Started child 142717 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.549 142717 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-506844'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.551 142606 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpp12ig6ig/privsep.sock']
Jan 23 10:02:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:02:55 compute-2 ceph-mon[75771]: pgmap v279: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 46 KiB/s rd, 85 B/s wr, 77 op/s
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.571 142717 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.572 142717 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.572 142717 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.578 142717 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.584 142717 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 10:02:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.594 142717 INFO eventlet.wsgi.server [-] (142717) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 23 10:02:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:55.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:56 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:56 compute-2 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 23 10:02:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.335 142606 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.336 142606 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpp12ig6ig/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.127 142723 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.134 142723 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.137 142723 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.137 142723 INFO oslo.privsep.daemon [-] privsep daemon running as pid 142723
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.338 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[528f047c-16e8-44a5-bcf0-49e980f2fbee]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:02:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:56.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:56 compute-2 ceph-mon[75771]: pgmap v280: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 51 KiB/s rd, 938 B/s wr, 83 op/s
Jan 23 10:02:56 compute-2 ceph-mon[75771]: pgmap v281: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 9.4 KiB/s rd, 938 B/s wr, 15 op/s
Jan 23 10:02:56 compute-2 ceph-mon[75771]: pgmap v282: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 9.7 KiB/s rd, 1023 B/s wr, 15 op/s
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.924 142723 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.924 142723 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:02:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.924 142723 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:02:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:57 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:57 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.685 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[1578408f-6cca-4f8d-af7a-979fe0e002ea]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.687 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, column=external_ids, values=({'neutron:ovn-metadata-id': '52d0a7f1-ccfa-5fc4-b009-582563d38ee8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.714 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.721 142606 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.721 142606 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.721 142606 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:57 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 10:02:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:02:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:57.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:02:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:58 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9640023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:58.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:58 compute-2 sudo[142857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:02:58 compute-2 sudo[142857]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:58 compute-2 sudo[142857]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:59 compute-2 python3.9[142856]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 10:02:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:59 compute-2 ceph-mon[75771]: pgmap v283: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 4.5 KiB/s rd, 938 B/s wr, 7 op/s
Jan 23 10:02:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:59 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:59 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:02:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:02:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:59.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:59 compute-2 sudo[143032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehxscdtigtsdbvshgdowfbjntvsiogmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162579.6608417-1425-117364402017054/AnsiballZ_stat.py'
Jan 23 10:02:59 compute-2 sudo[143032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:00 compute-2 python3.9[143034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:03:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:00 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:00 compute-2 sudo[143032]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:00 compute-2 sudo[143158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zglcwrzfvsaqbvfhethvbmfohgtdwoge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162579.6608417-1425-117364402017054/AnsiballZ_copy.py'
Jan 23 10:03:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:00.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:00 compute-2 sudo[143158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:00 compute-2 python3.9[143160]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162579.6608417-1425-117364402017054/.source.yaml _original_basename=.vpvltry9 follow=False checksum=d88282ad6bcd11f7bd2cbc3f4703eb6122d6b05d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:00 compute-2 sudo[143158]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100301 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:03:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:01 compute-2 sshd-session[133855]: Connection closed by 192.168.122.30 port 43630
Jan 23 10:03:01 compute-2 sshd-session[133852]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:03:01 compute-2 systemd[1]: session-51.scope: Deactivated successfully.
Jan 23 10:03:01 compute-2 systemd[1]: session-51.scope: Consumed 1min 2.380s CPU time.
Jan 23 10:03:01 compute-2 systemd-logind[786]: Session 51 logged out. Waiting for processes to exit.
Jan 23 10:03:01 compute-2 systemd-logind[786]: Removed session 51.
Jan 23 10:03:01 compute-2 ceph-mon[75771]: pgmap v284: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 4.5 KiB/s rd, 938 B/s wr, 7 op/s
Jan 23 10:03:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:01 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9640023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:01 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:01.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:02 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:02.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:02 compute-2 ceph-mon[75771]: pgmap v285: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 4.6 KiB/s rd, 938 B/s wr, 7 op/s
Jan 23 10:03:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:03 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:03 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9640030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:03.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:04 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:04.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:05 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:05 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:05 compute-2 ceph-mon[75771]: pgmap v286: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:03:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:03:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:05.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:06 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:06.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:07 compute-2 ceph-mon[75771]: pgmap v287: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:03:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:07 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:07 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:03:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:07.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:03:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:08.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:09 compute-2 ceph-mon[75771]: pgmap v288: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:03:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:09 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:09 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:09.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:10 compute-2 sshd-session[143194]: Accepted publickey for zuul from 192.168.122.30 port 34802 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:03:10 compute-2 systemd-logind[786]: New session 52 of user zuul.
Jan 23 10:03:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:10 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:10 compute-2 systemd[1]: Started Session 52 of User zuul.
Jan 23 10:03:10 compute-2 sshd-session[143194]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:03:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:10.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:11 compute-2 python3.9[143348]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:03:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:11 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:11 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:11 compute-2 ceph-mon[75771]: pgmap v289: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:03:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:11.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:12 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:12 compute-2 sudo[143503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkkbxqymfceavdrirtuilnvolgtrhnbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162591.8333714-60-53416330042812/AnsiballZ_command.py'
Jan 23 10:03:12 compute-2 sudo[143503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:12 compute-2 python3.9[143505]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:12.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:12 compute-2 sudo[143503]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:12 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:13 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:13 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:13 compute-2 sudo[143622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:03:13 compute-2 sudo[143622]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:13 compute-2 sudo[143622]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:13 compute-2 sudo[143714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioqpuewicxdxntpbbliixwbklznfqiqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162592.9664865-92-246873894635768/AnsiballZ_systemd_service.py'
Jan 23 10:03:13 compute-2 sudo[143714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:13 compute-2 sudo[143678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:03:13 compute-2 sudo[143678]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:13.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:13 compute-2 python3.9[143719]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:03:13 compute-2 systemd[1]: Reloading.
Jan 23 10:03:14 compute-2 systemd-sysv-generator[143781]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:03:14 compute-2 systemd-rc-local-generator[143777]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:03:14 compute-2 sudo[143678]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:14 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:14.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:14 compute-2 ceph-mon[75771]: pgmap v290: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:03:14 compute-2 sudo[143714]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:15 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:15 compute-2 python3.9[143941]: ansible-ansible.builtin.service_facts Invoked
Jan 23 10:03:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:15 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:15 compute-2 network[143958]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 10:03:15 compute-2 network[143959]: 'network-scripts' will be removed from distribution in near future.
Jan 23 10:03:15 compute-2 network[143960]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 10:03:15 compute-2 podman[143965]: 2026-01-23 10:03:15.66659435 +0000 UTC m=+0.105771040 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 23 10:03:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:15.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:16 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:16.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:17 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:17 compute-2 ceph-mon[75771]: pgmap v291: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:03:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:03:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:03:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:03:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:03:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:03:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:03:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:17 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:17.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:18 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:18.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:18 compute-2 sudo[144096]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:03:18 compute-2 sudo[144096]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:18 compute-2 sudo[144096]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:19 compute-2 ceph-mon[75771]: pgmap v292: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:03:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:19 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:19 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:19.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:19 compute-2 ceph-mon[75771]: pgmap v293: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:20.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:03:21 compute-2 ceph-mon[75771]: pgmap v294: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:21 compute-2 sudo[144277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gobaccfqesyqttxyfzhdhuvvmerbwder ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162600.7637742-150-183165289752411/AnsiballZ_systemd_service.py'
Jan 23 10:03:21 compute-2 sudo[144277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:21 compute-2 python3.9[144279]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:21 compute-2 sudo[144277]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:21 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:21 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:21 compute-2 sudo[144430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agtanmsdpdqcbqprjilhoeclgphpnqus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162601.4862282-150-6716644886492/AnsiballZ_systemd_service.py'
Jan 23 10:03:21 compute-2 sudo[144430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:21.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:22 compute-2 python3.9[144432]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:22 compute-2 sudo[144430]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:22 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:22.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:22 compute-2 sudo[144585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxmggrnjxdbkmdzitiutfdjkhynndlbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162602.2337651-150-37857913996873/AnsiballZ_systemd_service.py'
Jan 23 10:03:22 compute-2 sudo[144585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:22 compute-2 ceph-mon[75771]: pgmap v295: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:03:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:22 compute-2 python3.9[144587]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:22 compute-2 sudo[144585]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:23 compute-2 sudo[144738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxxxqnpmmqsnzpcewaavbslhetqwwgol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162602.9825823-150-163888747871977/AnsiballZ_systemd_service.py'
Jan 23 10:03:23 compute-2 sudo[144738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:23 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:23 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:23 compute-2 python3.9[144740]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:23 compute-2 sudo[144738]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:23 compute-2 podman[144742]: 2026-01-23 10:03:23.634369468 +0000 UTC m=+0.056545166 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:03:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:23.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:23 compute-2 sudo[144913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxjaloxacjwourhvdcysaobnhkhytonm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162603.7402804-150-253888126043214/AnsiballZ_systemd_service.py'
Jan 23 10:03:23 compute-2 sudo[144913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:24 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:24 compute-2 python3.9[144915]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:24 compute-2 sudo[144913]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:24.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:24 compute-2 sudo[145067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytcygvnsijabzhsitgrktnmwrkwbpijc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162604.4375937-150-49726776969664/AnsiballZ_systemd_service.py'
Jan 23 10:03:24 compute-2 sudo[145067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:25 compute-2 python3.9[145069]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:25 compute-2 sudo[145067]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:25 compute-2 ceph-mon[75771]: pgmap v296: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:25 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:25 compute-2 sudo[145221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbkuhbauwtchhhnqxziuxclekyvqgmvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162605.1882505-150-39214395870054/AnsiballZ_systemd_service.py'
Jan 23 10:03:25 compute-2 sudo[145221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:25 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:25 compute-2 python3.9[145223]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:25 compute-2 sudo[145221]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:25.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:26 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:26.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:26 compute-2 sudo[145251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:03:26 compute-2 sudo[145251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:26 compute-2 sudo[145251]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:27 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:03:27 compute-2 ceph-mon[75771]: pgmap v297: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:03:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:03:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:27 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:27 compute-2 sudo[145401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnjplebvilybnqqnbevfdyemuzgumtic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162607.222094-306-186296068059861/AnsiballZ_file.py'
Jan 23 10:03:27 compute-2 sudo[145401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:27 compute-2 python3.9[145403]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:27 compute-2 sudo[145401]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:27.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:28 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:28 compute-2 sudo[145554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaqlirmrrhbpnritxylxjleuyjprgtsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162608.0045047-306-175532433638819/AnsiballZ_file.py'
Jan 23 10:03:28 compute-2 sudo[145554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:28 compute-2 python3.9[145556]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:28 compute-2 sudo[145554]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:28.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:28 compute-2 sudo[145707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlxmpxjaaranubraolkaqphihngwhmyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162608.6102748-306-82859748456699/AnsiballZ_file.py'
Jan 23 10:03:28 compute-2 sudo[145707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:29 compute-2 python3.9[145709]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:29 compute-2 sudo[145707]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:29 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:29 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c000d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:29 compute-2 sudo[145859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsxfpychkqodpjoxambgwvyqwoiyndmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162609.2460864-306-95320336794384/AnsiballZ_file.py'
Jan 23 10:03:29 compute-2 sudo[145859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:29 compute-2 ceph-mon[75771]: pgmap v298: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:29 compute-2 python3.9[145861]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:29 compute-2 sudo[145859]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:29.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:30 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:30 compute-2 sudo[146012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgztokfnfqjvrwiyrnybovhmzirmkess ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162609.9040728-306-151693452596849/AnsiballZ_file.py'
Jan 23 10:03:30 compute-2 sudo[146012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:30.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:30 compute-2 python3.9[146014]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:30 compute-2 sudo[146012]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:30 compute-2 ceph-mon[75771]: pgmap v299: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:31 compute-2 sudo[146165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqkxbzoqoqlvsovlizaayrtosetmmcgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162610.768498-306-24122210371947/AnsiballZ_file.py'
Jan 23 10:03:31 compute-2 sudo[146165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:31 compute-2 python3.9[146167]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:31 compute-2 sudo[146165]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:31 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:31 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:31 compute-2 sudo[146317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbygajftpjbfpxxgskvlglamifucirvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162611.418275-306-213777166116310/AnsiballZ_file.py'
Jan 23 10:03:31 compute-2 sudo[146317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:31.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:31 compute-2 python3.9[146319]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:31 compute-2 sudo[146317]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:32 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:32.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:32 compute-2 sudo[146471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uofhalberlfzdqkiswjjegwxabobzupb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162612.4250388-456-99542147224316/AnsiballZ_file.py'
Jan 23 10:03:32 compute-2 sudo[146471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:32 compute-2 python3.9[146473]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:32 compute-2 sudo[146471]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:33 compute-2 ceph-mon[75771]: pgmap v300: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:03:33 compute-2 sudo[146623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcojslwslynjlaklhwnwgkbltxutbczh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162613.0279894-456-35306246517966/AnsiballZ_file.py'
Jan 23 10:03:33 compute-2 sudo[146623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:33 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:33 compute-2 python3.9[146625]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:33 compute-2 sudo[146623]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:33 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:33.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:33 compute-2 sudo[146775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcdywztqouqgdcnmdqnslergknlxwkfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162613.6262197-456-32844889225784/AnsiballZ_file.py'
Jan 23 10:03:33 compute-2 sudo[146775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:34 compute-2 python3.9[146778]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:34 compute-2 sudo[146775]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:34 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:34.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:34 compute-2 sudo[146929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfsykiehtxjetboajprgiecztamlhdbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162614.2505262-456-96357704180476/AnsiballZ_file.py'
Jan 23 10:03:34 compute-2 sudo[146929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:34 compute-2 python3.9[146931]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:34 compute-2 sudo[146929]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:35 compute-2 sudo[147081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxkdwhlnorflknkgaailgqnsluhwncvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162615.001917-456-91263469039794/AnsiballZ_file.py'
Jan 23 10:03:35 compute-2 sudo[147081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:35 compute-2 ceph-mon[75771]: pgmap v301: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:03:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:35 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:35 compute-2 python3.9[147083]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:35 compute-2 sudo[147081]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:35 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:35.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:35 compute-2 sudo[147233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqsifjycosjjisyfgsgtlpfxnczmrezb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162615.6184938-456-192953982981817/AnsiballZ_file.py'
Jan 23 10:03:35 compute-2 sudo[147233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:36 compute-2 python3.9[147235]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:36 compute-2 sudo[147233]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:36 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:36 compute-2 sudo[147387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlcykuaefxbywxafzkhbltuchajttkjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162616.2460651-456-38004497617153/AnsiballZ_file.py'
Jan 23 10:03:36 compute-2 sudo[147387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:36.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:36 compute-2 python3.9[147389]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:36 compute-2 sudo[147387]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:37 compute-2 ceph-mon[75771]: pgmap v302: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:03:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:37 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:37 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:37 compute-2 sudo[147539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usmsfwrvzndtuyhkudlvgetmtmyvaieh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162617.2288306-609-176220211579551/AnsiballZ_command.py'
Jan 23 10:03:37 compute-2 sudo[147539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:37 compute-2 python3.9[147541]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:37 compute-2 sudo[147539]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:37 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:37.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:38 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:38.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:38 compute-2 ceph-mon[75771]: pgmap v303: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:38 compute-2 python3.9[147695]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 10:03:39 compute-2 sudo[147720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:03:39 compute-2 sudo[147720]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:39 compute-2 sudo[147720]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:39 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:39 compute-2 sudo[147870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdhsjqyohrusktzpwivhlswrdzmsekaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162619.189607-663-168959813069467/AnsiballZ_systemd_service.py'
Jan 23 10:03:39 compute-2 sudo[147870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:39 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:39 compute-2 python3.9[147872]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:03:39 compute-2 systemd[1]: Reloading.
Jan 23 10:03:39 compute-2 systemd-rc-local-generator[147893]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:03:39 compute-2 systemd-sysv-generator[147901]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:03:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:39.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:40 compute-2 sudo[147870]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:40 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000048s ======
Jan 23 10:03:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:40.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Jan 23 10:03:40 compute-2 sudo[148058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxrdieottneqjcslrogaurxbzjjpskly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162620.361791-687-197543417857962/AnsiballZ_command.py'
Jan 23 10:03:40 compute-2 sudo[148058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:40 compute-2 python3.9[148060]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:40 compute-2 sudo[148058]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:41 compute-2 sudo[148211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohekdznzhwzllpuqkafvukrnhmxecpvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162621.0430803-687-268370428200180/AnsiballZ_command.py'
Jan 23 10:03:41 compute-2 sudo[148211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:41 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:41 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:41 compute-2 python3.9[148213]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:41 compute-2 sudo[148211]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:41 compute-2 ceph-mon[75771]: pgmap v304: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:41.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:41 compute-2 sudo[148367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekposjmipldfrxkpetvnmavhjtvksfqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162621.7306833-687-82117546409860/AnsiballZ_command.py'
Jan 23 10:03:42 compute-2 sudo[148367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:42 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9480040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:42 compute-2 python3.9[148369]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:42 compute-2 sudo[148367]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:42.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:42 compute-2 sudo[148521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziierntyglurcivripabqzwsqhuajayq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162622.367955-687-277651346294222/AnsiballZ_command.py'
Jan 23 10:03:42 compute-2 sudo[148521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:42 compute-2 ceph-mon[75771]: pgmap v305: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:03:42 compute-2 python3.9[148523]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:42 compute-2 sudo[148521]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:43 compute-2 sudo[148674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbsgkdhnftvrxwtbatywmerhevauwwpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162622.9931676-687-137174699526285/AnsiballZ_command.py'
Jan 23 10:03:43 compute-2 sudo[148674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:43 compute-2 python3.9[148676]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:43 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:43 compute-2 sudo[148674]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:43 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:43 compute-2 sudo[148827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euiiqlluyulzcefiktyqxpiajpnaphbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162623.6181376-687-151117625445708/AnsiballZ_command.py'
Jan 23 10:03:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:43.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:43 compute-2 sudo[148827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:44 compute-2 python3.9[148829]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:44 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:44 compute-2 sudo[148827]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:44.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:44 compute-2 sudo[148982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsumrjqwmyrnliytthynwomzcspzgjef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162624.322968-687-221571639014069/AnsiballZ_command.py'
Jan 23 10:03:44 compute-2 sudo[148982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:44 compute-2 python3.9[148984]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:44 compute-2 sudo[148982]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:45 compute-2 ceph-mon[75771]: pgmap v306: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:45 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9480040d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:45 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:45.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:46 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:46.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:46 compute-2 podman[149012]: 2026-01-23 10:03:46.68523611 +0000 UTC m=+0.104280953 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 10:03:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:47 compute-2 sudo[149163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kabrkbnrogwhqdpjyvfjwwubgkdaaxbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162626.8769422-849-68220972853568/AnsiballZ_getent.py'
Jan 23 10:03:47 compute-2 sudo[149163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:47 compute-2 python3.9[149165]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 23 10:03:47 compute-2 sudo[149163]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9480040f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:47 compute-2 ceph-mon[75771]: pgmap v307: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:03:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:47.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:48 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:48 compute-2 sudo[149317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdnknbjrsulzrioafwxktrumdegfxaqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162627.7692156-873-129545322873133/AnsiballZ_group.py'
Jan 23 10:03:48 compute-2 sudo[149317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:48 compute-2 python3.9[149319]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 10:03:48 compute-2 groupadd[149321]: group added to /etc/group: name=libvirt, GID=42473
Jan 23 10:03:48 compute-2 groupadd[149321]: group added to /etc/gshadow: name=libvirt
Jan 23 10:03:48 compute-2 groupadd[149321]: new group: name=libvirt, GID=42473
Jan 23 10:03:48 compute-2 sudo[149317]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:48.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:49 compute-2 sudo[149476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsrbtwroqycrugppuqquvvkycjffwcxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162628.8198416-896-171122572833934/AnsiballZ_user.py'
Jan 23 10:03:49 compute-2 sudo[149476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:49 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:49 compute-2 python3.9[149478]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 10:03:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:49 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:49 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:03:49 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:03:49 compute-2 useradd[149480]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 10:03:49 compute-2 ceph-mon[75771]: pgmap v308: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:49 compute-2 sudo[149476]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:03:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:49.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:03:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:50 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004110 fd 39 proxy ignored for local
Jan 23 10:03:50 compute-2 kernel: ganesha.nfsd[139642]: segfault at 50 ip 00007fd9e9f0e32e sp 00007fd9537fd210 error 4 in libntirpc.so.5.8[7fd9e9ef3000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 10:03:50 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:03:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:50 compute-2 systemd[1]: Started Process Core Dump (PID 149536/UID 0).
Jan 23 10:03:50 compute-2 sudo[149641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzygmlklmwwurvfgtcneokezpvhqfyiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162630.1550567-929-212579256435073/AnsiballZ_setup.py'
Jan 23 10:03:50 compute-2 sudo[149641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:50.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:03:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:51 compute-2 python3.9[149643]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 10:03:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:51 compute-2 ceph-mon[75771]: pgmap v309: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:51 compute-2 sudo[149641]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:51.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:52 compute-2 sudo[149726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsnewehbslrxlftdewblwdpbsfwrqkvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162630.1550567-929-212579256435073/AnsiballZ_dnf.py'
Jan 23 10:03:52 compute-2 sudo[149726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:52.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:52 compute-2 ceph-mon[75771]: pgmap v310: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:03:52 compute-2 python3.9[149728]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:03:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:03:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:03:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:54 compute-2 systemd-coredump[149544]: Process 136835 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 52:
                                                    #0  0x00007fd9e9f0e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007fd9e9f18900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:03:54 compute-2 systemd[1]: systemd-coredump@5-149536-0.service: Deactivated successfully.
Jan 23 10:03:54 compute-2 systemd[1]: systemd-coredump@5-149536-0.service: Consumed 4.061s CPU time.
Jan 23 10:03:54 compute-2 podman[149738]: 2026-01-23 10:03:54.407640161 +0000 UTC m=+0.026756057 container died 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 10:03:54 compute-2 systemd[1]: var-lib-containers-storage-overlay-ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904-merged.mount: Deactivated successfully.
Jan 23 10:03:54 compute-2 podman[149738]: 2026-01-23 10:03:54.450211927 +0000 UTC m=+0.069327803 container remove 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Jan 23 10:03:54 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:03:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100354 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:03:54 compute-2 podman[149737]: 2026-01-23 10:03:54.484062507 +0000 UTC m=+0.090233146 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 10:03:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:54.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:54 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:03:54 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.207s CPU time.
Jan 23 10:03:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:55 compute-2 ceph-mon[75771]: pgmap v311: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:03:55.464 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:03:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:03:55.466 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:03:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:03:55.466 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:03:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:55.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:56.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:57 compute-2 ceph-mon[75771]: pgmap v312: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:57.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:03:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:58.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:03:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:59 compute-2 sudo[149841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:03:59 compute-2 sudo[149841]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:59 compute-2 sudo[149841]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100359 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:03:59 compute-2 ceph-mon[75771]: pgmap v313: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:03:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:03:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:03:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:59.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:00.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:00 compute-2 ceph-mon[75771]: pgmap v314: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:04:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:01.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:02.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:03 compute-2 ceph-mon[75771]: pgmap v315: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:04:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:04:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:03.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:04:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:04.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:04 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 6.
Jan 23 10:04:04 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:04:04 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.207s CPU time.
Jan 23 10:04:04 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:04:05 compute-2 podman[150059]: 2026-01-23 10:04:05.051139481 +0000 UTC m=+0.050533482 container create cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:04:05 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:04:05 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:04:05 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:04:05 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:04:05 compute-2 podman[150059]: 2026-01-23 10:04:05.106281094 +0000 UTC m=+0.105675125 container init cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:04:05 compute-2 podman[150059]: 2026-01-23 10:04:05.112817274 +0000 UTC m=+0.112211275 container start cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:04:05 compute-2 bash[150059]: cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf
Jan 23 10:04:05 compute-2 podman[150059]: 2026-01-23 10:04:05.031666453 +0000 UTC m=+0.031060454 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:04:05 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:04:05 compute-2 ceph-mon[75771]: pgmap v316: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:04:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:04:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:04:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:05.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:04:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:06.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:07 compute-2 ceph-mon[75771]: pgmap v317: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 10:04:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:07.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:04:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:08.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:04:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:09 compute-2 ceph-mon[75771]: pgmap v318: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 10:04:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:09.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:04:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:10.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:04:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 10:04:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 10:04:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:04:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:04:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:04:11 compute-2 ceph-mon[75771]: pgmap v319: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 10:04:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:11.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:04:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:04:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:04:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:04:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:04:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:04:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:04:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:12.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:12 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:13 compute-2 ceph-mon[75771]: pgmap v320: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:04:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:13.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100414 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:04:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:14.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:15 compute-2 ceph-mon[75771]: pgmap v321: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:04:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:15.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:16.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:16 compute-2 ceph-mon[75771]: pgmap v322: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 1.7 KiB/s wr, 5 op/s
Jan 23 10:04:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:17 compute-2 podman[150134]: 2026-01-23 10:04:17.689622376 +0000 UTC m=+0.107550681 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 10:04:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:17.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000013:nfs.cephfs.1: -2
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:04:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:18.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:19 compute-2 ceph-mon[75771]: pgmap v323: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Jan 23 10:04:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:19 compute-2 kernel: SELinux:  Converting 2780 SID table entries...
Jan 23 10:04:19 compute-2 sudo[150178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:04:19 compute-2 sudo[150178]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:19 compute-2 sudo[150178]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:19 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 10:04:19 compute-2 kernel: SELinux:  policy capability open_perms=1
Jan 23 10:04:19 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 10:04:19 compute-2 kernel: SELinux:  policy capability always_check_network=0
Jan 23 10:04:19 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 10:04:19 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 10:04:19 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 10:04:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:19.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:04:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:20 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:20.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:21 compute-2 ceph-mon[75771]: pgmap v324: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Jan 23 10:04:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100421 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:04:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:21.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:22 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:22.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:23 compute-2 ceph-mon[75771]: pgmap v325: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Jan 23 10:04:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:23.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:24 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:24 compute-2 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 23 10:04:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:24.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:24 compute-2 podman[150216]: 2026-01-23 10:04:24.657749014 +0000 UTC m=+0.064322259 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 23 10:04:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:25 compute-2 ceph-mon[75771]: pgmap v326: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Jan 23 10:04:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:25.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:26 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:26.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:26 compute-2 sudo[150237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:04:26 compute-2 sudo[150237]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:26 compute-2 sudo[150237]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:26 compute-2 sudo[150262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 10:04:26 compute-2 sudo[150262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:27 compute-2 ceph-mon[75771]: pgmap v327: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Jan 23 10:04:27 compute-2 podman[150359]: 2026-01-23 10:04:27.608657903 +0000 UTC m=+0.095727570 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:04:27 compute-2 podman[150359]: 2026-01-23 10:04:27.726074645 +0000 UTC m=+0.213144282 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Jan 23 10:04:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:27.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:28 compute-2 podman[150457]: 2026-01-23 10:04:28.113314949 +0000 UTC m=+0.058977068 container exec 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:04:28 compute-2 podman[150457]: 2026-01-23 10:04:28.13125299 +0000 UTC m=+0.076915099 container exec_died 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:04:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:28 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 10:04:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 10:04:28 compute-2 podman[150561]: 2026-01-23 10:04:28.56394258 +0000 UTC m=+0.085919250 container exec cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 10:04:28 compute-2 podman[150561]: 2026-01-23 10:04:28.599131523 +0000 UTC m=+0.121108153 container exec_died cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Jan 23 10:04:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:28.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:28 compute-2 podman[150627]: 2026-01-23 10:04:28.803456559 +0000 UTC m=+0.052756706 container exec c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 10:04:28 compute-2 podman[150627]: 2026-01-23 10:04:28.816213222 +0000 UTC m=+0.065513359 container exec_died c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 10:04:29 compute-2 podman[150694]: 2026-01-23 10:04:29.118306467 +0000 UTC m=+0.141699589 container exec 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, architecture=x86_64, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, version=2.2.4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=Ceph keepalived)
Jan 23 10:04:29 compute-2 podman[150694]: 2026-01-23 10:04:29.134375131 +0000 UTC m=+0.157768233 container exec_died 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.openshift.expose-services=, release=1793, version=2.2.4, distribution-scope=public, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, vcs-type=git, name=keepalived, architecture=x86_64)
Jan 23 10:04:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:29 compute-2 sudo[150262]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:29 compute-2 sudo[150763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:04:29 compute-2 sudo[150763]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:29 compute-2 sudo[150763]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:29 compute-2 ceph-mon[75771]: pgmap v328: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:04:29 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:29 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:29 compute-2 sudo[150788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:04:29 compute-2 sudo[150788]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:29.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:30 compute-2 sudo[150788]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:30 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:30 compute-2 kernel: SELinux:  Converting 2780 SID table entries...
Jan 23 10:04:30 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 10:04:30 compute-2 kernel: SELinux:  policy capability open_perms=1
Jan 23 10:04:30 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 10:04:30 compute-2 kernel: SELinux:  policy capability always_check_network=0
Jan 23 10:04:30 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 10:04:30 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 10:04:30 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 10:04:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:30.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 10:04:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:04:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:04:30 compute-2 ceph-mon[75771]: pgmap v329: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:04:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:04:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:04:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:04:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:31.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:32 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:32.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:33 compute-2 ceph-mon[75771]: pgmap v330: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:04:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:33.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:34 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:34.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:35 compute-2 ceph-mon[75771]: pgmap v331: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:04:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:35 compute-2 sudo[150857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:04:35 compute-2 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 23 10:04:35 compute-2 sudo[150857]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:35 compute-2 sudo[150857]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:35.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:36 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:04:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:36.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:04:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:36 compute-2 ceph-mon[75771]: pgmap v332: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:37 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:04:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:37.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:04:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:38 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:38.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:39 compute-2 sudo[150886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:04:39 compute-2 sudo[150886]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:39 compute-2 sudo[150886]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:39.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:40 compute-2 ceph-mon[75771]: pgmap v333: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:40 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:40.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:41.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:42 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:42.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:42 compute-2 ceph-mon[75771]: pgmap v334: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:43 compute-2 ceph-mon[75771]: pgmap v335: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:04:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:43.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:44 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:44.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:44 compute-2 ceph-mon[75771]: pgmap v336: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:45 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:45 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:45.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:46 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:46.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:47 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:47 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:47 compute-2 ceph-mon[75771]: pgmap v337: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:47.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:48 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:04:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:48.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:04:48 compute-2 podman[153723]: 2026-01-23 10:04:48.699057055 +0000 UTC m=+0.099987875 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:04:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:49 compute-2 ceph-mon[75771]: pgmap v338: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:49 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:49 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:49.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:04:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:50 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:50.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:51 compute-2 ceph-mon[75771]: pgmap v339: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:51 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:51 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:51.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:52 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:52.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:53 compute-2 ceph-mon[75771]: pgmap v340: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:04:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:53 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:53 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:04:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:53.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:04:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:54 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:54.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:04:55.466 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:04:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:04:55.467 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:04:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:04:55.467 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:04:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:55 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:55 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:55 compute-2 podman[158500]: 2026-01-23 10:04:55.629747235 +0000 UTC m=+0.048917022 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 10:04:55 compute-2 ceph-mon[75771]: pgmap v341: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:55.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:56 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:56.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:56 compute-2 ceph-mon[75771]: pgmap v342: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:57 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:57 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:57.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:58 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:04:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:58.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:04:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:59 compute-2 ceph-mon[75771]: pgmap v343: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:59 compute-2 sudo[161089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:04:59 compute-2 sudo[161089]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:59 compute-2 sudo[161089]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:59 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:59 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:04:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:04:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:59.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:00 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:00.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:01 compute-2 ceph-mon[75771]: pgmap v344: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:01 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:01 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:01.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:02 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:02.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:03 compute-2 ceph-mon[75771]: pgmap v345: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:05:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:03 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:03 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:03.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:04 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:04.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:05.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:06 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:06 compute-2 ceph-mon[75771]: pgmap v346: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:05:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:06.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:07 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:07 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:07 compute-2 ceph-mon[75771]: pgmap v347: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:05:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:07.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:05:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:08 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:08 compute-2 ceph-mon[75771]: pgmap v348: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:08.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:09 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:09 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:09.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:10 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:10.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:10 compute-2 ceph-mon[75771]: pgmap v349: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:11.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:12.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:12 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:13 compute-2 ceph-mon[75771]: pgmap v350: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:05:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:13 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:13 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:05:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:13.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:05:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:14 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:05:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:14.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:05:14 compute-2 ceph-mon[75771]: pgmap v351: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:15 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:15 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:15.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:16 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:16.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:17 compute-2 ceph-mon[75771]: pgmap v352: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:18.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100518 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:05:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:18.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:19 compute-2 ceph-mon[75771]: pgmap v353: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:19 compute-2 sudo[167891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:05:19 compute-2 sudo[167891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:19 compute-2 sudo[167891]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:19 compute-2 podman[167907]: 2026-01-23 10:05:19.700797076 +0000 UTC m=+0.115047974 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 10:05:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:20.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:20 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:05:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:20.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:21 compute-2 ceph-mon[75771]: pgmap v354: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:22.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:22 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:22.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:23 compute-2 ceph-mon[75771]: pgmap v355: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:05:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:23 compute-2 kernel: SELinux:  Converting 2781 SID table entries...
Jan 23 10:05:23 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 10:05:23 compute-2 kernel: SELinux:  policy capability open_perms=1
Jan 23 10:05:23 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 10:05:23 compute-2 kernel: SELinux:  policy capability always_check_network=0
Jan 23 10:05:23 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 10:05:23 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 10:05:23 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 10:05:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:24.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:24 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:24.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:24 compute-2 groupadd[167963]: group added to /etc/group: name=dnsmasq, GID=993
Jan 23 10:05:24 compute-2 groupadd[167963]: group added to /etc/gshadow: name=dnsmasq
Jan 23 10:05:25 compute-2 groupadd[167963]: new group: name=dnsmasq, GID=993
Jan 23 10:05:25 compute-2 useradd[167970]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 23 10:05:25 compute-2 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 10:05:25 compute-2 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 23 10:05:25 compute-2 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 10:05:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:25 compute-2 ceph-mon[75771]: pgmap v356: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:05:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:26.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:26 compute-2 groupadd[167985]: group added to /etc/group: name=clevis, GID=992
Jan 23 10:05:26 compute-2 groupadd[167985]: group added to /etc/gshadow: name=clevis
Jan 23 10:05:26 compute-2 groupadd[167985]: new group: name=clevis, GID=992
Jan 23 10:05:26 compute-2 podman[167984]: 2026-01-23 10:05:26.208236398 +0000 UTC m=+0.081703817 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 10:05:26 compute-2 useradd[168011]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 23 10:05:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:26 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:26 compute-2 usermod[168021]: add 'clevis' to group 'tss'
Jan 23 10:05:26 compute-2 usermod[168021]: add 'clevis' to shadow group 'tss'
Jan 23 10:05:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:26.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:26 compute-2 ceph-mon[75771]: pgmap v357: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:26 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:05:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:28.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:28 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:28 compute-2 polkitd[43445]: Reloading rules
Jan 23 10:05:28 compute-2 polkitd[43445]: Collecting garbage unconditionally...
Jan 23 10:05:28 compute-2 polkitd[43445]: Loading rules from directory /etc/polkit-1/rules.d
Jan 23 10:05:28 compute-2 polkitd[43445]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 23 10:05:28 compute-2 polkitd[43445]: Finished loading, compiling and executing 3 rules
Jan 23 10:05:28 compute-2 polkitd[43445]: Reloading rules
Jan 23 10:05:28 compute-2 polkitd[43445]: Collecting garbage unconditionally...
Jan 23 10:05:28 compute-2 polkitd[43445]: Loading rules from directory /etc/polkit-1/rules.d
Jan 23 10:05:28 compute-2 polkitd[43445]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 23 10:05:28 compute-2 polkitd[43445]: Finished loading, compiling and executing 3 rules
Jan 23 10:05:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:28.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:29 compute-2 ceph-mon[75771]: pgmap v358: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:29 compute-2 groupadd[168214]: group added to /etc/group: name=ceph, GID=167
Jan 23 10:05:29 compute-2 groupadd[168214]: group added to /etc/gshadow: name=ceph
Jan 23 10:05:29 compute-2 groupadd[168214]: new group: name=ceph, GID=167
Jan 23 10:05:29 compute-2 useradd[168220]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 23 10:05:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:05:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:05:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:30.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:30 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:30 compute-2 ceph-mon[75771]: pgmap v359: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:05:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:30.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:05:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:32.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:32 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:32.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:32 compute-2 systemd[1]: Stopping OpenSSH server daemon...
Jan 23 10:05:32 compute-2 sshd[1005]: Received signal 15; terminating.
Jan 23 10:05:32 compute-2 systemd[1]: sshd.service: Deactivated successfully.
Jan 23 10:05:32 compute-2 systemd[1]: Stopped OpenSSH server daemon.
Jan 23 10:05:32 compute-2 systemd[1]: sshd.service: Consumed 2.134s CPU time, read 32.0K from disk, written 0B to disk.
Jan 23 10:05:32 compute-2 systemd[1]: Stopped target sshd-keygen.target.
Jan 23 10:05:32 compute-2 systemd[1]: Stopping sshd-keygen.target...
Jan 23 10:05:32 compute-2 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 10:05:32 compute-2 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 10:05:32 compute-2 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 10:05:32 compute-2 systemd[1]: Reached target sshd-keygen.target.
Jan 23 10:05:32 compute-2 systemd[1]: Starting OpenSSH server daemon...
Jan 23 10:05:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:32 compute-2 sshd[168889]: Server listening on 0.0.0.0 port 22.
Jan 23 10:05:32 compute-2 sshd[168889]: Server listening on :: port 22.
Jan 23 10:05:32 compute-2 systemd[1]: Started OpenSSH server daemon.
Jan 23 10:05:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:05:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:33 compute-2 ceph-mon[75771]: pgmap v360: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:05:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:34.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:34 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:34 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 10:05:34 compute-2 systemd[1]: Starting man-db-cache-update.service...
Jan 23 10:05:34 compute-2 systemd[1]: Reloading.
Jan 23 10:05:34 compute-2 systemd-rc-local-generator[169148]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:34 compute-2 systemd-sysv-generator[169151]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:05:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:34.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:05:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:34 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 10:05:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:36.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:36 compute-2 sudo[170569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:05:36 compute-2 sudo[170569]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:36 compute-2 sudo[170569]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:36 compute-2 sudo[170676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:05:36 compute-2 sudo[170676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:36 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:36 compute-2 sudo[170676]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:36.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:37 compute-2 ceph-mon[75771]: pgmap v361: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:05:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:05:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:37 compute-2 ceph-mon[75771]: pgmap v362: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:05:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:38.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:38 compute-2 sudo[149726]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:38 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:05:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:38.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:05:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:39 compute-2 sudo[174917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:05:39 compute-2 sudo[174917]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:39 compute-2 sudo[174917]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:40.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:40 compute-2 ceph-mon[75771]: pgmap v363: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:05:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:40 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100540 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:05:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:40.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:41 compute-2 ceph-mon[75771]: pgmap v364: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:05:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:05:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:05:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:05:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:05:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:05:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:42.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:42 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 10:05:42 compute-2 systemd[1]: Finished man-db-cache-update.service.
Jan 23 10:05:42 compute-2 systemd[1]: man-db-cache-update.service: Consumed 9.706s CPU time.
Jan 23 10:05:42 compute-2 systemd[1]: run-r619068a26e7d43b6b7f70f53699058ee.service: Deactivated successfully.
Jan 23 10:05:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:42 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:42.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:43 compute-2 ceph-mon[75771]: pgmap v365: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:05:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:44.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:44 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:05:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:44.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:05:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:45 compute-2 ceph-mon[75771]: pgmap v366: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:45 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:45 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:46 compute-2 sudo[177669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:05:46 compute-2 sudo[177669]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:46 compute-2 sudo[177669]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:46.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:46 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:46.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:46 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:46 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:46 compute-2 ceph-mon[75771]: pgmap v367: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:47 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:47 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:48.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:48 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ce0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:48.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:49 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:49 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:50.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:50 compute-2 ceph-mon[75771]: pgmap v368: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:05:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:05:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:50 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:50 compute-2 podman[177699]: 2026-01-23 10:05:50.698615748 +0000 UTC m=+0.126256730 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:05:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:50.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:51 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:51 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:51 compute-2 ceph-mon[75771]: pgmap v369: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:05:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:52.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:52 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:52 compute-2 sudo[177853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzuluyydsrswjoapuaxwpyzoqbtlpiiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162752.059129-966-229580201876021/AnsiballZ_systemd.py'
Jan 23 10:05:52 compute-2 sudo[177853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:52.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:52 compute-2 python3.9[177855]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:05:52 compute-2 systemd[1]: Reloading.
Jan 23 10:05:53 compute-2 systemd-rc-local-generator[177882]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:53 compute-2 systemd-sysv-generator[177886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:53 compute-2 ceph-mon[75771]: pgmap v370: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:05:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:53 compute-2 sudo[177853]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:53 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:53 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:53 compute-2 sudo[178042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oynravltaddtteueqqcjpcjkzeejtmvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162753.5099463-966-146465282124772/AnsiballZ_systemd.py'
Jan 23 10:05:53 compute-2 sudo[178042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:54.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:54 compute-2 python3.9[178044]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:05:54 compute-2 systemd[1]: Reloading.
Jan 23 10:05:54 compute-2 systemd-rc-local-generator[178072]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:54 compute-2 systemd-sysv-generator[178076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:54 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:54 compute-2 sudo[178042]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:54 compute-2 sudo[178234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxwtufotgzzpglmaodzrvpagtovwabvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162754.5279129-966-266996951400025/AnsiballZ_systemd.py'
Jan 23 10:05:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:05:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:54.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:05:54 compute-2 sudo[178234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:55 compute-2 python3.9[178236]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:05:55 compute-2 systemd[1]: Reloading.
Jan 23 10:05:55 compute-2 systemd-rc-local-generator[178264]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:55 compute-2 systemd-sysv-generator[178270]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:55 compute-2 ceph-mon[75771]: pgmap v371: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:55 compute-2 sudo[178234]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:05:55.467 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:05:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:05:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:05:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:05:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:05:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:55 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:55 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:55 compute-2 sudo[178424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysdqdfjynttmqvzzezymvgvaipujoqpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162755.5717635-966-114231021166446/AnsiballZ_systemd.py'
Jan 23 10:05:55 compute-2 sudo[178424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:56.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:56 compute-2 python3.9[178426]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:05:56 compute-2 systemd[1]: Reloading.
Jan 23 10:05:56 compute-2 systemd-rc-local-generator[178456]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:56 compute-2 systemd-sysv-generator[178459]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:56 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:56 compute-2 sudo[178424]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:56 compute-2 podman[178467]: 2026-01-23 10:05:56.511607074 +0000 UTC m=+0.052308465 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 10:05:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:56.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:57 compute-2 ceph-mon[75771]: pgmap v372: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:05:57 compute-2 sudo[178636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucxcwwvidfecefvripibdlikotmmgkbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162757.1198828-1053-68482247939703/AnsiballZ_systemd.py'
Jan 23 10:05:57 compute-2 sudo[178636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:57 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:57 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:57 compute-2 python3.9[178638]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:05:57 compute-2 systemd[1]: Reloading.
Jan 23 10:05:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:57 compute-2 systemd-sysv-generator[178674]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:57 compute-2 systemd-rc-local-generator[178671]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:58.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:58 compute-2 sudo[178636]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:58 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:58 compute-2 sudo[178830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odbycdqjiwzaucjmdczfnycxflqgktom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162758.1861823-1053-119814457021842/AnsiballZ_systemd.py'
Jan 23 10:05:58 compute-2 sudo[178830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:58 compute-2 python3.9[178832]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:05:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:05:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:58.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:58 compute-2 systemd[1]: Reloading.
Jan 23 10:05:58 compute-2 systemd-rc-local-generator[178859]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:58 compute-2 systemd-sysv-generator[178865]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:59 compute-2 sudo[178830]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:59 compute-2 ceph-mon[75771]: pgmap v373: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:59 compute-2 sudo[179019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euuzrdcxrucoiuuycrrzwuqgggnmarzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162759.2684655-1053-246899800345311/AnsiballZ_systemd.py'
Jan 23 10:05:59 compute-2 sudo[179019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:59 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:59 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:59 compute-2 sudo[179022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:05:59 compute-2 sudo[179022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:59 compute-2 sudo[179022]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:05:59 compute-2 python3.9[179021]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:05:59 compute-2 systemd[1]: Reloading.
Jan 23 10:06:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:00.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:00 compute-2 systemd-rc-local-generator[179079]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:06:00 compute-2 systemd-sysv-generator[179083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:06:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:00 compute-2 sudo[179019]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:00 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:00 compute-2 sudo[179236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhupzomzkqdtrdzhddmhhrdfdbikgepw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162760.379151-1053-269279309852253/AnsiballZ_systemd.py'
Jan 23 10:06:00 compute-2 sudo[179236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:00.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:00 compute-2 python3.9[179238]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:01 compute-2 sudo[179236]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:01 compute-2 ceph-mon[75771]: pgmap v374: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:01 compute-2 sudo[179391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvmplepofihkgfmdwmtugcvzwgppmqub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162761.138963-1053-196541375520813/AnsiballZ_systemd.py'
Jan 23 10:06:01 compute-2 sudo[179391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:01 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:01 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:01 compute-2 python3.9[179393]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:01 compute-2 systemd[1]: Reloading.
Jan 23 10:06:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:01 compute-2 systemd-rc-local-generator[179422]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:06:01 compute-2 systemd-sysv-generator[179426]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:06:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:02.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:02 compute-2 sudo[179391]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:02 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:02.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:03 compute-2 sudo[179583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syirxsnsmnxwaxiewolherufpimwsate ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162762.8231225-1161-222152960256795/AnsiballZ_systemd.py'
Jan 23 10:06:03 compute-2 sudo[179583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100603 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:06:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:03 compute-2 python3.9[179585]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:06:03 compute-2 systemd[1]: Reloading.
Jan 23 10:06:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:03 compute-2 systemd-sysv-generator[179619]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:06:03 compute-2 systemd-rc-local-generator[179616]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:06:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:03 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:03 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:03 compute-2 ceph-mon[75771]: pgmap v375: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:06:03 compute-2 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 23 10:06:03 compute-2 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 23 10:06:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:03 compute-2 sudo[179583]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:04.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:04 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:04 compute-2 sudo[179778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llkktkglztljrfcaeilcwnkzfblcckfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162764.0726683-1185-190610335067890/AnsiballZ_systemd.py'
Jan 23 10:06:04 compute-2 sudo[179778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:04 compute-2 python3.9[179780]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:04 compute-2 sudo[179778]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:04.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:05 compute-2 sudo[179933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwhwafdbnulvkwdtazfaxywcwclgedfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162764.8165097-1185-33799620893023/AnsiballZ_systemd.py'
Jan 23 10:06:05 compute-2 sudo[179933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:05 compute-2 python3.9[179935]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:05 compute-2 sudo[179933]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:05 compute-2 ceph-mon[75771]: pgmap v376: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:06:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:05 compute-2 sudo[180088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aefpphaqzdidlxplplshfpyzivwwvxpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162765.5720136-1185-92631781438642/AnsiballZ_systemd.py'
Jan 23 10:06:05 compute-2 sudo[180088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:06.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:06 compute-2 python3.9[180090]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:06 compute-2 sudo[180088]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:06 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:06 compute-2 sudo[180245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixjhdihezvwtvyahbdhaiekkfxoapafe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162766.4183888-1185-228597544855271/AnsiballZ_systemd.py'
Jan 23 10:06:06 compute-2 sudo[180245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:06.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:07 compute-2 python3.9[180247]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:07 compute-2 sudo[180245]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:07 compute-2 sudo[180400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oajvlexfeoejphbgyabgfbtlhahsgzwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162767.2417505-1185-11358484533040/AnsiballZ_systemd.py'
Jan 23 10:06:07 compute-2 sudo[180400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:07 compute-2 ceph-mon[75771]: pgmap v377: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:06:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:07 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:07 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:07 compute-2 python3.9[180402]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:07 compute-2 sudo[180400]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:08.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:08 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:08 compute-2 sudo[180557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knyxbaaqoabfdjdyqgwfyqrkapvldzaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162768.0603743-1185-189526144721894/AnsiballZ_systemd.py'
Jan 23 10:06:08 compute-2 sudo[180557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:08 compute-2 python3.9[180559]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:08 compute-2 ceph-mon[75771]: pgmap v378: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:06:08 compute-2 sudo[180557]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:06:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:08.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:06:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:09 compute-2 sudo[180712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdxbqaxgkiftmoguoykozmkeucltzavz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162768.8285556-1185-78241895227797/AnsiballZ_systemd.py'
Jan 23 10:06:09 compute-2 sudo[180712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:09 compute-2 python3.9[180714]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:09 compute-2 sudo[180712]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:09 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:09 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:09 compute-2 sudo[180867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngqcvrdnlvqahhcjszrihcfslnbkiwuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162769.581575-1185-186178656371856/AnsiballZ_systemd.py'
Jan 23 10:06:09 compute-2 sudo[180867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:10.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:10 compute-2 python3.9[180869]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:10 compute-2 sudo[180867]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:10 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:10 compute-2 sudo[181024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcwnwkufjdkilrghcceofggsvoflogrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162770.3557749-1185-224731938609894/AnsiballZ_systemd.py'
Jan 23 10:06:10 compute-2 sudo[181024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:10.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:10 compute-2 python3.9[181026]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:10 compute-2 sudo[181024]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:11 compute-2 sudo[181179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reudlzgwojirqvlrbvrscsskopjlupfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162771.0924518-1185-223165022837680/AnsiballZ_systemd.py'
Jan 23 10:06:11 compute-2 sudo[181179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:11 compute-2 ceph-mon[75771]: pgmap v379: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:06:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:11 compute-2 python3.9[181181]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:11 compute-2 sudo[181179]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:06:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:06:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:12.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:06:12 compute-2 sudo[181335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oymhmwthwpzjaqinxfsrziutxfjpkuam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162771.8551853-1185-281215845575991/AnsiballZ_systemd.py'
Jan 23 10:06:12 compute-2 sudo[181335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:12 compute-2 python3.9[181337]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:12 compute-2 sudo[181335]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:12.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:12 compute-2 sudo[181491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snpqnzwaoqglrnpuoavcszvcqwhurdiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162772.5636103-1185-22473600065782/AnsiballZ_systemd.py'
Jan 23 10:06:12 compute-2 sudo[181491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:13 compute-2 python3.9[181493]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:13 compute-2 sudo[181491]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:13 compute-2 ceph-mon[75771]: pgmap v380: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:13 compute-2 sudo[181646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugzzmkqhgxernaiprdftvajjteavmkuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162773.3089561-1185-211268229618961/AnsiballZ_systemd.py'
Jan 23 10:06:13 compute-2 sudo[181646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:13 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:13 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:13 compute-2 python3.9[181648]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:13 compute-2 sudo[181646]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:14.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:14 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:14 compute-2 sudo[181803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvxzmiuwgvhllheavznsbnottuwgiiyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162774.113736-1185-215789411936608/AnsiballZ_systemd.py'
Jan 23 10:06:14 compute-2 sudo[181803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:14 compute-2 python3.9[181805]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:06:14 compute-2 sudo[181803]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:14.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:06:14 compute-2 ceph-mon[75771]: pgmap v381: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:14 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:06:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:14 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:06:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:15 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:15 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:16.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:16 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:16.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:17 compute-2 ceph-mon[75771]: pgmap v382: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:06:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:06:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:18.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:18 compute-2 sudo[181962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlohwkxumprvlvmthkfwqyjvxqzsgmtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162778.420178-1490-140035805455936/AnsiballZ_file.py'
Jan 23 10:06:18 compute-2 sudo[181962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:18.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:18 compute-2 python3.9[181964]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:18 compute-2 sudo[181962]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:19 compute-2 sudo[182114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riuyuglqybgnmbflpglragtjcbblgnvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162779.0144734-1490-41776638034983/AnsiballZ_file.py'
Jan 23 10:06:19 compute-2 sudo[182114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:19 compute-2 python3.9[182116]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:19 compute-2 sudo[182114]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:19 compute-2 ceph-mon[75771]: pgmap v383: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:06:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:19 compute-2 sudo[182272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgmysbcyydlkyybdghjmznvrpduffkrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162779.5932677-1490-95979333370747/AnsiballZ_file.py'
Jan 23 10:06:19 compute-2 sudo[182272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:19 compute-2 sudo[182261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:06:19 compute-2 sudo[182261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:19 compute-2 sudo[182261]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:20 compute-2 python3.9[182288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:20 compute-2 sudo[182272]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:20.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:20 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:20 compute-2 sudo[182445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwacxdfppemgvxqmakrbctlfmfbfpfne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162780.1799629-1490-117143070161381/AnsiballZ_file.py'
Jan 23 10:06:20 compute-2 sudo[182445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:20 compute-2 python3.9[182447]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:20 compute-2 sudo[182445]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:06:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:20.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:21 compute-2 sudo[182612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plnmfizkbkkscaypwviempxakgmkbrey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162780.7560387-1490-256527840661984/AnsiballZ_file.py'
Jan 23 10:06:21 compute-2 sudo[182612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:21 compute-2 podman[182571]: 2026-01-23 10:06:21.609304246 +0000 UTC m=+0.405412012 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Jan 23 10:06:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:21 compute-2 python3.9[182620]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:21 compute-2 sudo[182612]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:21 compute-2 ceph-mon[75771]: pgmap v384: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:06:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:22 compute-2 sudo[182776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbpnqkihvsqjooovbtogxsimqqkldvjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162781.8135276-1490-89217173840712/AnsiballZ_file.py'
Jan 23 10:06:22 compute-2 sudo[182776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:22.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:22 compute-2 python3.9[182778]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:22 compute-2 sudo[182776]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:22 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:22.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:22 compute-2 ceph-mon[75771]: pgmap v385: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:06:23 compute-2 python3.9[182929]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:06:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100623 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:06:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:23 compute-2 sudo[183079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npnxhpqkqmjiwrkqhjbiqhcwsmnovjxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162783.418961-1644-38909013991840/AnsiballZ_stat.py'
Jan 23 10:06:23 compute-2 sudo[183079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:24 compute-2 python3.9[183081]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:24 compute-2 sudo[183079]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:24.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:24 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:24 compute-2 sudo[183206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ketwzojphhwekujrpgyescxjtrsitznk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162783.418961-1644-38909013991840/AnsiballZ_copy.py'
Jan 23 10:06:24 compute-2 sudo[183206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:24 compute-2 python3.9[183208]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162783.418961-1644-38909013991840/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:24 compute-2 sudo[183206]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:24.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:25 compute-2 sudo[183358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuyybijhruczptbcrvsqlfvanneuasje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162784.9020076-1644-213166137515331/AnsiballZ_stat.py'
Jan 23 10:06:25 compute-2 sudo[183358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:25 compute-2 python3.9[183360]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:25 compute-2 sudo[183358]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:25 compute-2 ceph-mon[75771]: pgmap v386: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:06:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.892141) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785892360, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4241, "num_deletes": 502, "total_data_size": 11742776, "memory_usage": 11936904, "flush_reason": "Manual Compaction"}
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 23 10:06:25 compute-2 sudo[183483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txoygjjudjmbwykixeuiyeptlncglbum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162784.9020076-1644-213166137515331/AnsiballZ_copy.py'
Jan 23 10:06:25 compute-2 sudo[183483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785962585, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4407138, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13108, "largest_seqno": 17344, "table_properties": {"data_size": 4395820, "index_size": 6404, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3845, "raw_key_size": 30531, "raw_average_key_size": 19, "raw_value_size": 4368991, "raw_average_value_size": 2857, "num_data_blocks": 279, "num_entries": 1529, "num_filter_entries": 1529, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162371, "oldest_key_time": 1769162371, "file_creation_time": 1769162785, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 70464 microseconds, and 14805 cpu microseconds.
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.962670) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4407138 bytes OK
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.962705) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.978398) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.978466) EVENT_LOG_v1 {"time_micros": 1769162785978455, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.978501) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11724047, prev total WAL file size 11724047, number of live WAL files 2.
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.981241) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323532' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4303KB)], [27(12MB)]
Jan 23 10:06:25 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785981425, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 17630701, "oldest_snapshot_seqno": -1}
Jan 23 10:06:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:26.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:26 compute-2 python3.9[183486]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162784.9020076-1644-213166137515331/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4980 keys, 13174161 bytes, temperature: kUnknown
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786148686, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13174161, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13139094, "index_size": 21517, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 124859, "raw_average_key_size": 25, "raw_value_size": 13046841, "raw_average_value_size": 2619, "num_data_blocks": 899, "num_entries": 4980, "num_filter_entries": 4980, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162785, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.149125) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13174161 bytes
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.162842) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.3 rd, 78.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 12.6 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(7.0) write-amplify(3.0) OK, records in: 5808, records dropped: 828 output_compression: NoCompression
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.162892) EVENT_LOG_v1 {"time_micros": 1769162786162873, "job": 14, "event": "compaction_finished", "compaction_time_micros": 167403, "compaction_time_cpu_micros": 43582, "output_level": 6, "num_output_files": 1, "total_output_size": 13174161, "num_input_records": 5808, "num_output_records": 4980, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786163716, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 23 10:06:26 compute-2 sudo[183483]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786166084, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.981058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166124) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166126) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166128) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:26 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0004080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:26 compute-2 sudo[183637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbbheqjwajobeilpxnojublccpviwugt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162786.2812843-1644-28256511466840/AnsiballZ_stat.py'
Jan 23 10:06:26 compute-2 sudo[183637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:26 compute-2 podman[183639]: 2026-01-23 10:06:26.631627408 +0000 UTC m=+0.052609967 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:06:26 compute-2 python3.9[183640]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:26 compute-2 sudo[183637]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:26.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:26 compute-2 ceph-mon[75771]: pgmap v387: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:06:27 compute-2 sudo[183782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cojliebxjrazftsapxehvhpwseduqyaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162786.2812843-1644-28256511466840/AnsiballZ_copy.py'
Jan 23 10:06:27 compute-2 sudo[183782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:27 compute-2 python3.9[183784]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162786.2812843-1644-28256511466840/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:27 compute-2 sudo[183782]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:27 compute-2 sudo[183934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osigrarwwdgadiymhbopdlmgazmhpawz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162787.4281337-1644-80179343992359/AnsiballZ_stat.py'
Jan 23 10:06:27 compute-2 sudo[183934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:27 compute-2 python3.9[183936]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:27 compute-2 sudo[183934]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:28.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:28 compute-2 sudo[184060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wokahfigofhgqvzeoxlbcfwwqkntuegy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162787.4281337-1644-80179343992359/AnsiballZ_copy.py'
Jan 23 10:06:28 compute-2 sudo[184060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:28 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:28 compute-2 python3.9[184062]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162787.4281337-1644-80179343992359/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:28 compute-2 sudo[184060]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:28 compute-2 sudo[184213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nczwqsyznxddsaiotbdlpzmquhezmtrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162788.5212467-1644-18048058307373/AnsiballZ_stat.py'
Jan 23 10:06:28 compute-2 sudo[184213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:28.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:28 compute-2 python3.9[184215]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:29 compute-2 sudo[184213]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:29 compute-2 ceph-mon[75771]: pgmap v388: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:29 compute-2 sudo[184338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reywdivcqwblritsvkybmnseziyhvzpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162788.5212467-1644-18048058307373/AnsiballZ_copy.py'
Jan 23 10:06:29 compute-2 sudo[184338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:29 compute-2 python3.9[184340]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162788.5212467-1644-18048058307373/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:29 compute-2 sudo[184338]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:30.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:30 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:30 compute-2 sudo[184495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvgmajwezwsdmwivxssgddnrdrsgrgup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162789.8751047-1644-275273793834793/AnsiballZ_stat.py'
Jan 23 10:06:30 compute-2 sudo[184495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:30 compute-2 python3.9[184497]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:30 compute-2 sudo[184495]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:30.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:31 compute-2 sudo[184620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uybrktojywtzwtnnrgoiwuxzotstjare ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162789.8751047-1644-275273793834793/AnsiballZ_copy.py'
Jan 23 10:06:31 compute-2 sudo[184620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:31 compute-2 python3.9[184622]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162789.8751047-1644-275273793834793/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:31 compute-2 ceph-mon[75771]: pgmap v389: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:31 compute-2 sudo[184620]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:31 compute-2 sudo[184772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqwtrssisjccwpujfucvjbmscwhxaedd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162791.4486651-1644-105620540118285/AnsiballZ_stat.py'
Jan 23 10:06:31 compute-2 sudo[184772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:31 compute-2 python3.9[184774]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:31 compute-2 sudo[184772]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:32.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:32 compute-2 sudo[184896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkvginxwsmhstktohgbgcfhwdzppxnwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162791.4486651-1644-105620540118285/AnsiballZ_copy.py'
Jan 23 10:06:32 compute-2 sudo[184896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:32 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:32 compute-2 python3.9[184898]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162791.4486651-1644-105620540118285/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:32 compute-2 sudo[184896]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:32 compute-2 sudo[185049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpztpvnwdfftleneljhbqcczmjihieyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162792.5588086-1644-143105795448352/AnsiballZ_stat.py'
Jan 23 10:06:32 compute-2 sudo[185049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:32.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:33 compute-2 python3.9[185051]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:33 compute-2 sudo[185049]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:33 compute-2 sudo[185174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsgbrwoxvfibnatcvtsvhpwfamjzifqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162792.5588086-1644-143105795448352/AnsiballZ_copy.py'
Jan 23 10:06:33 compute-2 sudo[185174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:33 compute-2 python3.9[185176]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162792.5588086-1644-143105795448352/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:33 compute-2 ceph-mon[75771]: pgmap v390: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:33 compute-2 sudo[185174]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:34.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:34 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:34.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:34 compute-2 sudo[185328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmzpgngdzbjwpqjfoltjkwgsbvywlirn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162794.5647407-1983-198714845553666/AnsiballZ_command.py'
Jan 23 10:06:34 compute-2 sudo[185328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:35 compute-2 python3.9[185330]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 23 10:06:35 compute-2 sudo[185328]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:35 compute-2 ceph-mon[75771]: pgmap v391: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:06:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:35 compute-2 sudo[185481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdwxgnwddmfvuowoqddybhjnqmleuklo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162795.592157-2009-148695944456538/AnsiballZ_file.py'
Jan 23 10:06:35 compute-2 sudo[185481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:36 compute-2 python3.9[185483]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:36 compute-2 sudo[185481]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:36.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:36 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:36 compute-2 sudo[185635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjlapsjghlshzullecgrjupwbxboolpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162796.2094479-2009-117805375382167/AnsiballZ_file.py'
Jan 23 10:06:36 compute-2 sudo[185635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:36 compute-2 python3.9[185637]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:36 compute-2 sudo[185635]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:36.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:37 compute-2 sudo[185787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwsxssvdpdpwfxnfdlyzpztgrtkgdqme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162796.7823703-2009-60392969513833/AnsiballZ_file.py'
Jan 23 10:06:37 compute-2 sudo[185787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:37 compute-2 python3.9[185789]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:37 compute-2 sudo[185787]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:37 compute-2 sudo[185939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouzjklzaphhnvohijxmzfgtiwirgeytr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162797.3396466-2009-158664495851570/AnsiballZ_file.py'
Jan 23 10:06:37 compute-2 sudo[185939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:37 compute-2 python3.9[185941]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:37 compute-2 sudo[185939]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:38.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:38 compute-2 sudo[186092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueomkuihvteyuprasgcdmluxthhmgule ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162797.8736584-2009-77832819151863/AnsiballZ_file.py'
Jan 23 10:06:38 compute-2 sudo[186092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:38 compute-2 ceph-mon[75771]: pgmap v392: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:38 compute-2 python3.9[186094]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:38 compute-2 sudo[186092]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:38 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:38 compute-2 sudo[186245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jibubisuozmfrsogbfgebkcwtwrsqqhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162798.4255927-2009-108164910985536/AnsiballZ_file.py'
Jan 23 10:06:38 compute-2 sudo[186245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:38 compute-2 ceph-mon[75771]: pgmap v393: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:38.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:38 compute-2 python3.9[186247]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:38 compute-2 sudo[186245]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:39 compute-2 sudo[186397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbzyxdgodxevkhqjoptguzayhoqolmvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162799.009143-2009-185732096521058/AnsiballZ_file.py'
Jan 23 10:06:39 compute-2 sudo[186397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:39 compute-2 python3.9[186399]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:39 compute-2 sudo[186397]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:39 compute-2 sudo[186549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqlwtxoamxlxgwyckldnlezsgzgukdju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162799.5544355-2009-4301173330650/AnsiballZ_file.py'
Jan 23 10:06:39 compute-2 sudo[186549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:39 compute-2 sudo[186552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:06:39 compute-2 sudo[186552]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:39 compute-2 sudo[186552]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:40 compute-2 python3.9[186551]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:40 compute-2 sudo[186549]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:40.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:40 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:40 compute-2 sudo[186728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxlujlukrxjummunusptkyrhojvvsajg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162800.173671-2009-268453297590051/AnsiballZ_file.py'
Jan 23 10:06:40 compute-2 sudo[186728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:40 compute-2 python3.9[186730]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:40 compute-2 sudo[186728]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:06:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:40.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:06:41 compute-2 sudo[186880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgskxulyzvhvnioonfkvfdolwxzbppzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162800.7679121-2009-161761160857339/AnsiballZ_file.py'
Jan 23 10:06:41 compute-2 sudo[186880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:41 compute-2 python3.9[186882]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:41 compute-2 sudo[186880]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:41 compute-2 ceph-mon[75771]: pgmap v394: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:41 compute-2 sudo[187032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrjnvcnomgsuklvwifnqrnukkfgjfslq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162801.3091063-2009-215915054794782/AnsiballZ_file.py'
Jan 23 10:06:41 compute-2 sudo[187032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:41 compute-2 python3.9[187034]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:41 compute-2 sudo[187032]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:42.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:42 compute-2 sudo[187185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pngdimmkasdzqeqkzwlyctepvhrzyflr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162801.886061-2009-45302147566184/AnsiballZ_file.py'
Jan 23 10:06:42 compute-2 sudo[187185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:42 compute-2 python3.9[187187]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:42 compute-2 sudo[187185]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:42 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:42 compute-2 sudo[187338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auhanyscvujijpvmygwiwfjwhbqcdvvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162802.4283426-2009-233422745117621/AnsiballZ_file.py'
Jan 23 10:06:42 compute-2 sudo[187338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:42.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:42 compute-2 python3.9[187340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:42 compute-2 sudo[187338]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:43 compute-2 ceph-mon[75771]: pgmap v395: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:06:43 compute-2 sudo[187490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frgrhllsgasfekaplmnpgukcbvjwbawz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162803.232359-2009-198752080177932/AnsiballZ_file.py'
Jan 23 10:06:43 compute-2 sudo[187490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy ignored for local
Jan 23 10:06:43 compute-2 kernel: ganesha.nfsd[167951]: segfault at 50 ip 00007fd168b5932e sp 00007fd0e57f9210 error 4 in libntirpc.so.5.8[7fd168b3e000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 10:06:43 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:06:43 compute-2 systemd[1]: Started Process Core Dump (PID 187493/UID 0).
Jan 23 10:06:43 compute-2 python3.9[187492]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:43 compute-2 sudo[187490]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:44.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:44.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:45 compute-2 systemd-coredump[187494]: Process 150078 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 59:
                                                    #0  0x00007fd168b5932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:06:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:45 compute-2 systemd[1]: systemd-coredump@6-187493-0.service: Deactivated successfully.
Jan 23 10:06:45 compute-2 systemd[1]: systemd-coredump@6-187493-0.service: Consumed 1.518s CPU time.
Jan 23 10:06:45 compute-2 podman[187525]: 2026-01-23 10:06:45.295332151 +0000 UTC m=+0.026159171 container died cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 10:06:45 compute-2 systemd[1]: var-lib-containers-storage-overlay-a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b-merged.mount: Deactivated successfully.
Jan 23 10:06:45 compute-2 podman[187525]: 2026-01-23 10:06:45.3360182 +0000 UTC m=+0.066845200 container remove cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:06:45 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:06:45 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:06:45 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.052s CPU time.
Jan 23 10:06:45 compute-2 ceph-mon[75771]: pgmap v396: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:45 compute-2 sudo[187693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tprkmibnvygcbopeesifyivedhdfmbpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162805.30712-2307-86417166662119/AnsiballZ_stat.py'
Jan 23 10:06:45 compute-2 sudo[187693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:45 compute-2 python3.9[187695]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:45 compute-2 sudo[187693]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:46 compute-2 sudo[187817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzieowdaokxvxgqyyhlxqlwkentdmegl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162805.30712-2307-86417166662119/AnsiballZ_copy.py'
Jan 23 10:06:46 compute-2 sudo[187817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:46.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:46 compute-2 sudo[187820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:06:46 compute-2 sudo[187820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:46 compute-2 sudo[187820]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:46 compute-2 sudo[187845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:06:46 compute-2 sudo[187845]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:46 compute-2 python3.9[187819]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162805.30712-2307-86417166662119/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:46 compute-2 sudo[187817]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:46 compute-2 sudo[188049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujevtrytzwfjibbhurxriwxgtbpzkvlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162806.4120848-2307-10540368233366/AnsiballZ_stat.py'
Jan 23 10:06:46 compute-2 sudo[188049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:46 compute-2 sudo[187845]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:46 compute-2 python3.9[188051]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:46.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:46 compute-2 sudo[188049]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:47 compute-2 sudo[188172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwttjndpsycmoisltgiqrijaqvalvepn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162806.4120848-2307-10540368233366/AnsiballZ_copy.py'
Jan 23 10:06:47 compute-2 sudo[188172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:47 compute-2 python3.9[188174]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162806.4120848-2307-10540368233366/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:47 compute-2 sudo[188172]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:47 compute-2 ceph-mon[75771]: pgmap v397: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:47 compute-2 sudo[188324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrcekaquqezzectqlmxczegnahkqazjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162807.6564941-2307-131628813116022/AnsiballZ_stat.py'
Jan 23 10:06:47 compute-2 sudo[188324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:48.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:48 compute-2 python3.9[188326]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:48 compute-2 sudo[188324]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:48 compute-2 sudo[188449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnkfmublltdkbeimyiajxpykeeyrocwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162807.6564941-2307-131628813116022/AnsiballZ_copy.py'
Jan 23 10:06:48 compute-2 sudo[188449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:48 compute-2 python3.9[188451]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162807.6564941-2307-131628813116022/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:48 compute-2 sudo[188449]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:06:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:48.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:06:49 compute-2 sudo[188601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnndwdowhwidlakwtbhzastpuingkwoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162808.7409387-2307-120372273617361/AnsiballZ_stat.py'
Jan 23 10:06:49 compute-2 sudo[188601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:49 compute-2 python3.9[188603]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:49 compute-2 sudo[188601]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:49 compute-2 sudo[188724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reuhwgriyzbawdqqseqqvggokmjtmyoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162808.7409387-2307-120372273617361/AnsiballZ_copy.py'
Jan 23 10:06:49 compute-2 sudo[188724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:49 compute-2 ceph-mon[75771]: pgmap v398: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100649 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:06:49 compute-2 python3.9[188726]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162808.7409387-2307-120372273617361/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:49 compute-2 sudo[188724]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:50 compute-2 sudo[188877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwrowvhjixiwhoucubsyttchoadjpixz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162809.8273191-2307-88197046777382/AnsiballZ_stat.py'
Jan 23 10:06:50 compute-2 sudo[188877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:50.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:50 compute-2 python3.9[188879]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:50 compute-2 sudo[188877]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:50 compute-2 sudo[189001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnwswciytfdexcemjfigusuuzvsdqrrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162809.8273191-2307-88197046777382/AnsiballZ_copy.py'
Jan 23 10:06:50 compute-2 sudo[189001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:50 compute-2 python3.9[189003]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162809.8273191-2307-88197046777382/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:50 compute-2 sudo[189001]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:06:50 compute-2 ceph-mon[75771]: pgmap v399: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:06:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:06:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:50.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:51 compute-2 sudo[189153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwrhyrluuxapgkendyclnujpxckjqrca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162810.8301108-2307-256139144719868/AnsiballZ_stat.py'
Jan 23 10:06:51 compute-2 sudo[189153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:51 compute-2 python3.9[189155]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:51 compute-2 sudo[189153]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:51 compute-2 sudo[189276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esyighbyqlgmsmfhpwtocmrvcgbkwvig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162810.8301108-2307-256139144719868/AnsiballZ_copy.py'
Jan 23 10:06:51 compute-2 sudo[189276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:51 compute-2 python3.9[189278]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162810.8301108-2307-256139144719868/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:51 compute-2 sudo[189276]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:52.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:52 compute-2 sudo[189442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvncvjxqxgwufvzxniozhhfonvwwvieu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162812.0001886-2307-275382021497293/AnsiballZ_stat.py'
Jan 23 10:06:52 compute-2 sudo[189442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:52 compute-2 podman[189403]: 2026-01-23 10:06:52.329412572 +0000 UTC m=+0.083780689 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 10:06:52 compute-2 python3.9[189450]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:52 compute-2 sudo[189442]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100652 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:06:52 compute-2 sudo[189579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjuummkczfnwhpfdloeziunauongmqyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162812.0001886-2307-275382021497293/AnsiballZ_copy.py'
Jan 23 10:06:52 compute-2 sudo[189579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:06:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:52.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:06:52 compute-2 python3.9[189581]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162812.0001886-2307-275382021497293/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:53 compute-2 sudo[189579]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:53 compute-2 ceph-mon[75771]: pgmap v400: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:06:53 compute-2 sudo[189731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seoopguqgqlhuwllrdkxbnabvprrantm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162813.1640015-2307-82976802028080/AnsiballZ_stat.py'
Jan 23 10:06:53 compute-2 sudo[189731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:53 compute-2 python3.9[189733]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:53 compute-2 sudo[189731]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:53 compute-2 sudo[189855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhdtxhzlgskpxstlkspxyuyvqngwdkdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162813.1640015-2307-82976802028080/AnsiballZ_copy.py'
Jan 23 10:06:53 compute-2 sudo[189855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:54.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:54 compute-2 python3.9[189857]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162813.1640015-2307-82976802028080/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:54 compute-2 sudo[189855]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:54 compute-2 sudo[190008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfofevrxztmiwadjlmxjcefcifxvaddh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162814.2811522-2307-268013191509008/AnsiballZ_stat.py'
Jan 23 10:06:54 compute-2 sudo[190008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:54 compute-2 python3.9[190010]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:54 compute-2 sudo[190008]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:54.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:55 compute-2 sudo[190131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xckjbstjwzhqosmrwqduwizbyqmdleuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162814.2811522-2307-268013191509008/AnsiballZ_copy.py'
Jan 23 10:06:55 compute-2 sudo[190131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:55 compute-2 python3.9[190133]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162814.2811522-2307-268013191509008/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:55 compute-2 sudo[190131]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:06:55.468 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:06:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:06:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:06:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:06:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:06:55 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 7.
Jan 23 10:06:55 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:06:55 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.052s CPU time.
Jan 23 10:06:55 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:06:55 compute-2 sudo[190318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prawwbrzhsrcncmzhptiimtjexffosrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162815.4426026-2307-106569025354388/AnsiballZ_stat.py'
Jan 23 10:06:55 compute-2 sudo[190318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:55 compute-2 podman[190331]: 2026-01-23 10:06:55.815786652 +0000 UTC m=+0.027075193 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:06:55 compute-2 ceph-mon[75771]: pgmap v401: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:06:55 compute-2 podman[190331]: 2026-01-23 10:06:55.932811436 +0000 UTC m=+0.144099967 container create cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:06:55 compute-2 python3.9[190326]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:55 compute-2 sudo[190318]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:06:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:06:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:06:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:06:56 compute-2 podman[190331]: 2026-01-23 10:06:56.101867321 +0000 UTC m=+0.313155852 container init cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 10:06:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:56.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:56 compute-2 podman[190331]: 2026-01-23 10:06:56.109165292 +0000 UTC m=+0.320453813 container start cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:06:56 compute-2 sudo[190350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:06:56 compute-2 sudo[190350]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:56 compute-2 sudo[190350]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:56 compute-2 bash[190331]: cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291
Jan 23 10:06:56 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:06:56 compute-2 sudo[190535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oplufrudfqnzovksiytqxquyijxzccwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162815.4426026-2307-106569025354388/AnsiballZ_copy.py'
Jan 23 10:06:56 compute-2 sudo[190535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:56 compute-2 python3.9[190537]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162815.4426026-2307-106569025354388/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:56 compute-2 sudo[190535]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:56.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:57 compute-2 ceph-mon[75771]: pgmap v402: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:06:57 compute-2 sudo[190700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfpcdfpfsxclntweotryiwbgzdmmkefc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162816.8482833-2307-215245772927508/AnsiballZ_stat.py'
Jan 23 10:06:57 compute-2 sudo[190700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:57 compute-2 podman[190661]: 2026-01-23 10:06:57.122577348 +0000 UTC m=+0.053253402 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 10:06:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:57 compute-2 python3.9[190708]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:57 compute-2 sudo[190700]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:57 compute-2 sudo[190829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghrylehgtnriukkachgwehanxfngyhtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162816.8482833-2307-215245772927508/AnsiballZ_copy.py'
Jan 23 10:06:57 compute-2 sudo[190829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:58 compute-2 python3.9[190831]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162816.8482833-2307-215245772927508/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:58 compute-2 sudo[190829]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:58.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:58 compute-2 sudo[190983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbuadrrhgpugofhwgjufptpgezcwbhop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162818.2063055-2307-33683673485829/AnsiballZ_stat.py'
Jan 23 10:06:58 compute-2 sudo[190983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:58 compute-2 python3.9[190985]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:58 compute-2 sudo[190983]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:06:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:06:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:58.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:06:58 compute-2 sudo[191106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cktdckhbhahohhopvzfqfkllwgytzlfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162818.2063055-2307-33683673485829/AnsiballZ_copy.py'
Jan 23 10:06:58 compute-2 sudo[191106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:59 compute-2 python3.9[191108]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162818.2063055-2307-33683673485829/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:59 compute-2 sudo[191106]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:06:59 compute-2 ceph-mon[75771]: pgmap v403: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:06:59 compute-2 sudo[191258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avpjocohzyhlmrmysocffezimgooxacx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162819.2830079-2307-86897461180381/AnsiballZ_stat.py'
Jan 23 10:06:59 compute-2 sudo[191258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:59 compute-2 python3.9[191260]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:59 compute-2 sudo[191258]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:00 compute-2 sudo[191400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzizbhbdlinuzfcttffgwnvfxaiuswhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162819.2830079-2307-86897461180381/AnsiballZ_copy.py'
Jan 23 10:07:00 compute-2 sudo[191400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:00 compute-2 sudo[191366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:07:00 compute-2 sudo[191366]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:00 compute-2 sudo[191366]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:00.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:00 compute-2 python3.9[191407]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162819.2830079-2307-86897461180381/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:00 compute-2 sudo[191400]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:00 compute-2 sudo[191560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqqhpiuerfbyoiksoiqwdqtpyfgxoxuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162820.340449-2307-51118746440456/AnsiballZ_stat.py'
Jan 23 10:07:00 compute-2 sudo[191560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:00.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:01 compute-2 python3.9[191562]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:01 compute-2 sudo[191560]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:01 compute-2 sudo[191683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gushwavmstqrwtcmulqkiwkeyxoqvspd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162820.340449-2307-51118746440456/AnsiballZ_copy.py'
Jan 23 10:07:01 compute-2 sudo[191683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:01 compute-2 ceph-mon[75771]: pgmap v404: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:07:01 compute-2 python3.9[191685]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162820.340449-2307-51118746440456/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:01 compute-2 sudo[191683]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:02.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:02 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:07:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:02 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:07:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:02.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:03 compute-2 python3.9[191837]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:03 compute-2 ceph-mon[75771]: pgmap v405: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Jan 23 10:07:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:04 compute-2 sudo[191991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugyzhunhwpfnlhjwacczquhkvrjcgnsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162823.6091647-2925-187953169146583/AnsiballZ_seboolean.py'
Jan 23 10:07:04 compute-2 sudo[191991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:04.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:04 compute-2 python3.9[191993]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 23 10:07:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:04.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:05 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:07:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:05 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:07:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:05 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:07:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:05 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:07:05 compute-2 ceph-mon[75771]: pgmap v406: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 511 B/s wr, 1 op/s
Jan 23 10:07:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:07:05 compute-2 sudo[191991]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:06 compute-2 sudo[192149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oenmdagtahdqludrtemefcolxiyfobus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162825.7553308-2948-81040227436699/AnsiballZ_copy.py'
Jan 23 10:07:06 compute-2 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 23 10:07:06 compute-2 sudo[192149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:06.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:06 compute-2 python3.9[192151]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:06 compute-2 sudo[192149]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:06 compute-2 sudo[192302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbxfrmubrtuzuaebdtuikinhubalzwwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162826.369311-2948-198051832880278/AnsiballZ_copy.py'
Jan 23 10:07:06 compute-2 sudo[192302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:06 compute-2 python3.9[192304]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:06.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:06 compute-2 sudo[192302]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:06 compute-2 ceph-mon[75771]: pgmap v407: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:07:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:07 compute-2 sudo[192454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olsteuscbvvpvtwahpsmwndeqtrcfscz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162827.03442-2948-178538920090762/AnsiballZ_copy.py'
Jan 23 10:07:07 compute-2 sudo[192454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:07 compute-2 python3.9[192456]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:07 compute-2 sudo[192454]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:07 compute-2 sudo[192606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdhpdzzixhnuklicwghbvppoeuougoop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162827.6478286-2948-201298905725128/AnsiballZ_copy.py'
Jan 23 10:07:07 compute-2 sudo[192606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:08 compute-2 python3.9[192609]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:08.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:08 compute-2 sudo[192606]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:08 compute-2 sudo[192775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogkjbmulynqgfrghbvfqsuymrmmfeypx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162828.2577653-2948-65427441718019/AnsiballZ_copy.py'
Jan 23 10:07:08 compute-2 sudo[192775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:08 compute-2 python3.9[192777]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:08 compute-2 sudo[192775]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:08.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:09 compute-2 ceph-mon[75771]: pgmap v408: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:07:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:09 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:09 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a14000fb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:10 compute-2 sudo[192928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viwrfduxdyclygulqetyhqaechstqiob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162829.8226848-3057-127309045232228/AnsiballZ_copy.py'
Jan 23 10:07:10 compute-2 sudo[192928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:10.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:10 compute-2 python3.9[192930]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:10 compute-2 sudo[192928]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:10 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18001550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:10 compute-2 sudo[193081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dydqcltkepfoxvyvphidfzsxfuguqbmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162830.390148-3057-117455423982772/AnsiballZ_copy.py'
Jan 23 10:07:10 compute-2 sudo[193081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:10 compute-2 python3.9[193083]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:10 compute-2 sudo[193081]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:11 compute-2 sudo[193233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwvaidvjjybnorkkxvkuupklcuhzujwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162830.9510362-3057-148696413558422/AnsiballZ_copy.py'
Jan 23 10:07:11 compute-2 sudo[193233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:11 compute-2 python3.9[193235]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:11 compute-2 sudo[193233]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:11 compute-2 ceph-mon[75771]: pgmap v409: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:07:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100711 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:07:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:11 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00001140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:11 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:11 compute-2 sudo[193385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtlqfnhvanzheesqmgksccififlpmsfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162831.5042953-3057-99749877959703/AnsiballZ_copy.py'
Jan 23 10:07:11 compute-2 sudo[193385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:11 compute-2 python3.9[193387]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:11 compute-2 sudo[193385]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:12.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:12 compute-2 sudo[193539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvtvxlhnikhjrqaxzvopxuyputlhewqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162832.1030009-3057-210063550957592/AnsiballZ_copy.py'
Jan 23 10:07:12 compute-2 sudo[193539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:12 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:12 compute-2 python3.9[193541]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:12 compute-2 sudo[193539]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:12 compute-2 ceph-mon[75771]: pgmap v410: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Jan 23 10:07:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.834952) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832835343, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 710, "num_deletes": 251, "total_data_size": 1525849, "memory_usage": 1546608, "flush_reason": "Manual Compaction"}
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832847487, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 986082, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17349, "largest_seqno": 18054, "table_properties": {"data_size": 982539, "index_size": 1387, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7958, "raw_average_key_size": 19, "raw_value_size": 975535, "raw_average_value_size": 2373, "num_data_blocks": 61, "num_entries": 411, "num_filter_entries": 411, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162786, "oldest_key_time": 1769162786, "file_creation_time": 1769162832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 12529 microseconds, and 8171 cpu microseconds.
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847598) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 986082 bytes OK
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847649) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.851185) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.851220) EVENT_LOG_v1 {"time_micros": 1769162832851209, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.851246) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1522032, prev total WAL file size 1522032, number of live WAL files 2.
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.852140) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(962KB)], [30(12MB)]
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832852304, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14160243, "oldest_snapshot_seqno": -1}
Jan 23 10:07:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:12.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4876 keys, 11785648 bytes, temperature: kUnknown
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832957101, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11785648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11752290, "index_size": 20064, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123329, "raw_average_key_size": 25, "raw_value_size": 11662851, "raw_average_value_size": 2391, "num_data_blocks": 835, "num_entries": 4876, "num_filter_entries": 4876, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.957418) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11785648 bytes
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.959046) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 134.9 rd, 112.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.6 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(26.3) write-amplify(12.0) OK, records in: 5391, records dropped: 515 output_compression: NoCompression
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.959066) EVENT_LOG_v1 {"time_micros": 1769162832959057, "job": 16, "event": "compaction_finished", "compaction_time_micros": 104930, "compaction_time_cpu_micros": 28842, "output_level": 6, "num_output_files": 1, "total_output_size": 11785648, "num_input_records": 5391, "num_output_records": 4876, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832959488, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832962718, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.851956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962855) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962861) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962862) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962864) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962866) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:13 compute-2 sudo[193691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojscgjelsfmtcszqpyzgwhxnnkrnmpit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162833.033335-3165-70785266692841/AnsiballZ_systemd.py'
Jan 23 10:07:13 compute-2 sudo[193691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:13 compute-2 python3.9[193693]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:13 compute-2 systemd[1]: Reloading.
Jan 23 10:07:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:13 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a180021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:13 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:13 compute-2 systemd-rc-local-generator[193719]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:13 compute-2 systemd-sysv-generator[193722]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:13 compute-2 systemd[1]: Starting libvirt logging daemon socket...
Jan 23 10:07:13 compute-2 systemd[1]: Listening on libvirt logging daemon socket.
Jan 23 10:07:13 compute-2 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 23 10:07:13 compute-2 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 23 10:07:13 compute-2 systemd[1]: Starting libvirt logging daemon...
Jan 23 10:07:14 compute-2 systemd[1]: Started libvirt logging daemon.
Jan 23 10:07:14 compute-2 sudo[193691]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:14.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:14 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a14001c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:14 compute-2 sudo[193885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uolyevlmhxgymtxfkagvikvygaazxnsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162834.2151437-3165-213455194143228/AnsiballZ_systemd.py'
Jan 23 10:07:14 compute-2 sudo[193885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100714 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:07:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:14 compute-2 python3.9[193887]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:14 compute-2 systemd[1]: Reloading.
Jan 23 10:07:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:14.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:14 compute-2 systemd-rc-local-generator[193914]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:14 compute-2 systemd-sysv-generator[193917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:15 compute-2 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 23 10:07:15 compute-2 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 23 10:07:15 compute-2 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 23 10:07:15 compute-2 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 23 10:07:15 compute-2 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 23 10:07:15 compute-2 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 23 10:07:15 compute-2 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 23 10:07:15 compute-2 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 10:07:15 compute-2 systemd[1]: Started libvirt nodedev daemon.
Jan 23 10:07:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:15 compute-2 sudo[193885]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:15 compute-2 ceph-mon[75771]: pgmap v411: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 597 B/s wr, 2 op/s
Jan 23 10:07:15 compute-2 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 23 10:07:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:15 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:15 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a180021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:15 compute-2 sudo[194105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hehegemsnkjtbhulnociayargzpgyonu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162835.3870978-3165-135652406597458/AnsiballZ_systemd.py'
Jan 23 10:07:15 compute-2 sudo[194105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:15 compute-2 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 23 10:07:15 compute-2 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 23 10:07:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:15 compute-2 python3.9[194110]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:15 compute-2 systemd[1]: Reloading.
Jan 23 10:07:16 compute-2 systemd-rc-local-generator[194142]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:16 compute-2 systemd-sysv-generator[194147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:16.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:16 compute-2 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 23 10:07:16 compute-2 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 23 10:07:16 compute-2 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 23 10:07:16 compute-2 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 23 10:07:16 compute-2 systemd[1]: Starting libvirt proxy daemon...
Jan 23 10:07:16 compute-2 systemd[1]: Started libvirt proxy daemon.
Jan 23 10:07:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:16 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:16 compute-2 sudo[194105]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:16 compute-2 setroubleshoot[193924]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 05acf4e4-e5f9-415e-a73e-9b333aec0c09
Jan 23 10:07:16 compute-2 setroubleshoot[193924]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 23 10:07:16 compute-2 setroubleshoot[193924]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 05acf4e4-e5f9-415e-a73e-9b333aec0c09
Jan 23 10:07:16 compute-2 setroubleshoot[193924]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 23 10:07:16 compute-2 sudo[194326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgexalvckdoyrvqfnhbmvhwedyunxsll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162836.5135417-3165-228577648229014/AnsiballZ_systemd.py'
Jan 23 10:07:16 compute-2 sudo[194326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:16.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:17 compute-2 python3.9[194328]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:17 compute-2 systemd[1]: Reloading.
Jan 23 10:07:17 compute-2 systemd-sysv-generator[194358]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:17 compute-2 systemd-rc-local-generator[194355]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:17 compute-2 ceph-mon[75771]: pgmap v412: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 597 B/s wr, 2 op/s
Jan 23 10:07:17 compute-2 systemd[1]: Listening on libvirt locking daemon socket.
Jan 23 10:07:17 compute-2 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 23 10:07:17 compute-2 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 23 10:07:17 compute-2 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 23 10:07:17 compute-2 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 23 10:07:17 compute-2 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 23 10:07:17 compute-2 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 23 10:07:17 compute-2 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 23 10:07:17 compute-2 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 23 10:07:17 compute-2 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 23 10:07:17 compute-2 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 10:07:17 compute-2 systemd[1]: Started libvirt QEMU daemon.
Jan 23 10:07:17 compute-2 sudo[194326]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:17 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:17 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a14001c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:17 compute-2 sudo[194542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqlnipjoimvpxaslxugtolivqtdgpszg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162837.676572-3165-112993902388063/AnsiballZ_systemd.py'
Jan 23 10:07:17 compute-2 sudo[194542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:18.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:18 compute-2 python3.9[194544]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:18 compute-2 systemd[1]: Reloading.
Jan 23 10:07:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:18 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18002ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:18 compute-2 systemd-rc-local-generator[194573]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:18 compute-2 systemd-sysv-generator[194576]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:18 compute-2 systemd[1]: Starting libvirt secret daemon socket...
Jan 23 10:07:18 compute-2 systemd[1]: Listening on libvirt secret daemon socket.
Jan 23 10:07:18 compute-2 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 23 10:07:18 compute-2 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 23 10:07:18 compute-2 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 23 10:07:18 compute-2 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 23 10:07:18 compute-2 systemd[1]: Starting libvirt secret daemon...
Jan 23 10:07:18 compute-2 systemd[1]: Started libvirt secret daemon.
Jan 23 10:07:18 compute-2 sudo[194542]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:18.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:19 compute-2 ceph-mon[75771]: pgmap v413: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:07:19 compute-2 sudo[194755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjspjvbrlaigvbvqgfwpgwvadtybuxqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162839.3066196-3276-45660633049658/AnsiballZ_file.py'
Jan 23 10:07:19 compute-2 sudo[194755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:19 compute-2 python3.9[194757]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:19 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:19 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:19 compute-2 sudo[194755]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:20 compute-2 sudo[194840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:07:20 compute-2 sudo[194840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:20 compute-2 sudo[194840]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:20.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:20 compute-2 sudo[194936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyxbnscvqoxdpncuklwrduxlxvacqcgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162840.0011818-3300-23100158715189/AnsiballZ_find.py'
Jan 23 10:07:20 compute-2 sudo[194936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:20 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:20 compute-2 python3.9[194938]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 10:07:20 compute-2 sudo[194936]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:07:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:20 compute-2 sudo[195089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhpjxmurdzsgjhsknranpocbjflxbygn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162840.6529984-3323-52194340577919/AnsiballZ_command.py'
Jan 23 10:07:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:20.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:20 compute-2 sudo[195089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:21 compute-2 python3.9[195091]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:21 compute-2 sudo[195089]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:21 compute-2 ceph-mon[75771]: pgmap v414: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:07:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:21 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:21 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:21 compute-2 python3.9[195245]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 10:07:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:22.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:22 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18003800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:22 compute-2 podman[195368]: 2026-01-23 10:07:22.693785292 +0000 UTC m=+0.105182161 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:07:22 compute-2 python3.9[195407]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:22.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:23 compute-2 ceph-mon[75771]: pgmap v415: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:07:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:23 compute-2 python3.9[195544]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162842.3597481-3381-72835351102307/.source.xml follow=False _original_basename=secret.xml.j2 checksum=19688f6e42a741164eafec41a84b8e73a76d185a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:23 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:23 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:23 compute-2 sudo[195694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biyequxxbppjutztxiautwkyggqxkscb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162843.5938182-3425-78557840479402/AnsiballZ_command.py'
Jan 23 10:07:23 compute-2 sudo[195694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:24 compute-2 python3.9[195696]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine f3005f84-239a-55b6-a948-8f1fb592b920
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:24 compute-2 polkitd[43445]: Registered Authentication Agent for unix-process:195699:399674 (system bus name :1.1838 [pkttyagent --process 195699 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Jan 23 10:07:24 compute-2 polkitd[43445]: Unregistered Authentication Agent for unix-process:195699:399674 (system bus name :1.1838, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Jan 23 10:07:24 compute-2 polkitd[43445]: Registered Authentication Agent for unix-process:195698:399674 (system bus name :1.1839 [pkttyagent --process 195698 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Jan 23 10:07:24 compute-2 polkitd[43445]: Unregistered Authentication Agent for unix-process:195698:399674 (system bus name :1.1839, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Jan 23 10:07:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:24.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:24 compute-2 sudo[195694]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:24 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:24 compute-2 python3.9[195860]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:24.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:25 compute-2 sudo[196010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idbjyhcbbfxnnraubrkfxhdkbxznlgpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162845.147555-3473-256228758252685/AnsiballZ_command.py'
Jan 23 10:07:25 compute-2 sudo[196010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:25 compute-2 ceph-mon[75771]: pgmap v416: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:25 compute-2 sudo[196010]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:25 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:25 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004120 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:26 compute-2 sudo[196164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjnwetsqpbulmoakvtupvgwpfybuhsdl ; FSID=f3005f84-239a-55b6-a948-8f1fb592b920 KEY=AQB8Q3NpAAAAABAATAj6yCl+1UaIO/yyy7nUXA== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162845.8328023-3497-149157165036477/AnsiballZ_command.py'
Jan 23 10:07:26 compute-2 sudo[196164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:26.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:26 compute-2 polkitd[43445]: Registered Authentication Agent for unix-process:196168:399902 (system bus name :1.1842 [pkttyagent --process 196168 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Jan 23 10:07:26 compute-2 polkitd[43445]: Unregistered Authentication Agent for unix-process:196168:399902 (system bus name :1.1842, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Jan 23 10:07:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:26 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:26 compute-2 sudo[196164]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:26 compute-2 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 23 10:07:26 compute-2 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.028s CPU time.
Jan 23 10:07:26 compute-2 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 23 10:07:26 compute-2 sudo[196323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbwyjknudckgkqviezhosxclbwsdlnih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162846.565602-3521-198354480991888/AnsiballZ_copy.py'
Jan 23 10:07:26 compute-2 sudo[196323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:26.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:27 compute-2 python3.9[196325]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:27 compute-2 sudo[196323]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:27 compute-2 ceph-mon[75771]: pgmap v417: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:27 compute-2 sudo[196485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snikcjuytxwfoyxbfyrajfzjszptasqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162847.223294-3546-78742458669500/AnsiballZ_stat.py'
Jan 23 10:07:27 compute-2 sudo[196485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:27 compute-2 podman[196449]: 2026-01-23 10:07:27.518402039 +0000 UTC m=+0.059700953 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 10:07:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:27 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:27 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8001ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:27 compute-2 python3.9[196497]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:27 compute-2 sudo[196485]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:28 compute-2 sudo[196619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdysjyczodmaqtpnnssuritnaqgnhsgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162847.223294-3546-78742458669500/AnsiballZ_copy.py'
Jan 23 10:07:28 compute-2 sudo[196619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:28.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:28 compute-2 python3.9[196621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162847.223294-3546-78742458669500/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:28 compute-2 sudo[196619]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:28 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004120 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:28 compute-2 sudo[196772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emqlpdhshqpjrvvfusbwckvvbvptrmxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162848.6020827-3594-239458060141553/AnsiballZ_file.py'
Jan 23 10:07:28 compute-2 sudo[196772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:28.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:29 compute-2 python3.9[196774]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:29 compute-2 sudo[196772]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:29 compute-2 sudo[196924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctwwauowoavlceehtyujmeagrzozwaoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162849.2754357-3618-53621165379084/AnsiballZ_stat.py'
Jan 23 10:07:29 compute-2 sudo[196924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:29 compute-2 ceph-mon[75771]: pgmap v418: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:29 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:29 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:29 compute-2 python3.9[196926]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:29 compute-2 sudo[196924]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:29 compute-2 sudo[197003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmyyrjiunqfsktkoixtkrlotdhpmsqpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162849.2754357-3618-53621165379084/AnsiballZ_file.py'
Jan 23 10:07:29 compute-2 sudo[197003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:30.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:30 compute-2 python3.9[197005]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:30 compute-2 sudo[197003]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:30 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8001ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:30 compute-2 sudo[197156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcseftfxzyeweglqwykwsrvqoqgsidix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162850.4511154-3654-58170248791271/AnsiballZ_stat.py'
Jan 23 10:07:30 compute-2 sudo[197156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:30 compute-2 python3.9[197158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:30 compute-2 sudo[197156]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:30.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:31 compute-2 sudo[197234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtgcbqvoxpalvaxkedbiqxccjpyusicc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162850.4511154-3654-58170248791271/AnsiballZ_file.py'
Jan 23 10:07:31 compute-2 sudo[197234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:31 compute-2 python3.9[197236]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mba36d0j recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:31 compute-2 sudo[197234]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:31 compute-2 auditd[703]: Audit daemon rotating log files
Jan 23 10:07:31 compute-2 ceph-mon[75771]: pgmap v419: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:31 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:31 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8001ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:31 compute-2 sudo[197386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctiornqlizvxpzpcabdqexnceinmescw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162851.5347638-3690-60883224563203/AnsiballZ_stat.py'
Jan 23 10:07:31 compute-2 sudo[197386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:31 compute-2 python3.9[197388]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:32 compute-2 sudo[197386]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:32.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:32 compute-2 sudo[197465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yceywantnnptpteshpmyutvggocqspsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162851.5347638-3690-60883224563203/AnsiballZ_file.py'
Jan 23 10:07:32 compute-2 sudo[197465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:32 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:32 compute-2 python3.9[197467]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:32 compute-2 sudo[197465]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:32.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:32 compute-2 sudo[197618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivqozquumcxetacqjkzpjgqawxrtlzyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162852.7141306-3729-171539574628507/AnsiballZ_command.py'
Jan 23 10:07:32 compute-2 sudo[197618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:33 compute-2 python3.9[197620]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:33 compute-2 sudo[197618]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:33 compute-2 ceph-mon[75771]: pgmap v420: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:07:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:33 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:33 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:33 compute-2 sudo[197771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skdimfkndxxohczbnzkxqyjbiscaicsw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162853.3777707-3753-141442326613747/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 10:07:33 compute-2 sudo[197771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:33 compute-2 python3[197773]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 10:07:33 compute-2 sudo[197771]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:34.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:34 compute-2 sshd-session[197875]: banner exchange: Connection from 64.62.156.162 port 60628: invalid format
Jan 23 10:07:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:34 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8001ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:34 compute-2 sudo[197926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjtxeegsobnqfdqmnkdfkgqufybrdxqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162854.132546-3776-34072434107431/AnsiballZ_stat.py'
Jan 23 10:07:34 compute-2 sudo[197926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:34 compute-2 python3.9[197928]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:34 compute-2 sudo[197926]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:34 compute-2 ceph-mon[75771]: pgmap v421: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:34 compute-2 sudo[198004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmfcwwoushpoxhsyehvlfyfbinzacoqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162854.132546-3776-34072434107431/AnsiballZ_file.py'
Jan 23 10:07:34 compute-2 sudo[198004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:34.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:35 compute-2 python3.9[198006]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:35 compute-2 sudo[198004]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:35 compute-2 sudo[198156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtjyrfxkpldxoywtvxpcrnbrupicuxee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162855.257572-3813-12043831208285/AnsiballZ_stat.py'
Jan 23 10:07:35 compute-2 sudo[198156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:35 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:35 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:35 compute-2 python3.9[198158]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:07:35 compute-2 sudo[198156]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:36 compute-2 sudo[198282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkmrikavmfcnkhzvpcgzovhevfpwpbjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162855.257572-3813-12043831208285/AnsiballZ_copy.py'
Jan 23 10:07:36 compute-2 sudo[198282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:36.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:36 compute-2 python3.9[198284]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162855.257572-3813-12043831208285/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:36 compute-2 sudo[198282]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:36 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:36 compute-2 sudo[198435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zamymkkbwordkzxzlsbnjoqcjarjlazg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162856.4900057-3858-223381257738154/AnsiballZ_stat.py'
Jan 23 10:07:36 compute-2 sudo[198435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:36 compute-2 ceph-mon[75771]: pgmap v422: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:36 compute-2 python3.9[198437]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:36.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:36 compute-2 sudo[198435]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:37 compute-2 sudo[198513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpiiemutzosxozdaibaxttlechtcsibr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162856.4900057-3858-223381257738154/AnsiballZ_file.py'
Jan 23 10:07:37 compute-2 sudo[198513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:37 compute-2 python3.9[198515]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:37 compute-2 sudo[198513]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:37 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:37 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a000030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:37 compute-2 sudo[198665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzkhjdiktppndkvkdmmhewljnkwnaktj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162857.5523994-3894-14408983166524/AnsiballZ_stat.py'
Jan 23 10:07:37 compute-2 sudo[198665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:38 compute-2 python3.9[198667]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:38 compute-2 sudo[198665]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:38.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:38 compute-2 sudo[198744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vulfpzuwdnjrfjebgiputolpupafzbhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162857.5523994-3894-14408983166524/AnsiballZ_file.py'
Jan 23 10:07:38 compute-2 sudo[198744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:38 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:38 compute-2 python3.9[198746]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:38 compute-2 sudo[198744]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:38.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:39 compute-2 sudo[198897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzmbejhbuytrsofhdjqckozkgueirxfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162858.670145-3930-38712543530726/AnsiballZ_stat.py'
Jan 23 10:07:39 compute-2 sudo[198897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:39 compute-2 python3.9[198899]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:39 compute-2 sudo[198897]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:39 compute-2 ceph-mon[75771]: pgmap v423: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:39 compute-2 sudo[199022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdomusvfkkmkhwlbbswvzmyiquhktzmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162858.670145-3930-38712543530726/AnsiballZ_copy.py'
Jan 23 10:07:39 compute-2 sudo[199022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:39 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:39 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:39 compute-2 python3.9[199024]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162858.670145-3930-38712543530726/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:39 compute-2 sudo[199022]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:40.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:40 compute-2 sudo[199125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:07:40 compute-2 sudo[199125]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:40 compute-2 sudo[199125]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:40 compute-2 sudo[199200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkrwnnkiaqsvjlvebsylstsgfdbchdpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162860.0124726-3975-151187170133252/AnsiballZ_file.py'
Jan 23 10:07:40 compute-2 sudo[199200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:40 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a000030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:40 compute-2 python3.9[199202]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:40 compute-2 sudo[199200]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:40.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:40 compute-2 sudo[199353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtwfedyftakahywofigkmgyjuhidnuky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162860.7231777-3999-238488632117416/AnsiballZ_command.py'
Jan 23 10:07:40 compute-2 sudo[199353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:41 compute-2 python3.9[199355]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:41 compute-2 sudo[199353]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:41 compute-2 ceph-mon[75771]: pgmap v424: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:41 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:41 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:41 compute-2 sudo[199508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppyhzekmsxnjabneywoieuoniustysfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162861.4430478-4023-64558906590366/AnsiballZ_blockinfile.py'
Jan 23 10:07:41 compute-2 sudo[199508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:42 compute-2 python3.9[199510]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:42 compute-2 sudo[199508]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:42.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:42 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:42 compute-2 sudo[199662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpsghjviavlypnwzdalojgyxmjxmhshy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162862.4702458-4050-114493216111396/AnsiballZ_command.py'
Jan 23 10:07:42 compute-2 sudo[199662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:42 compute-2 python3.9[199664]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:42 compute-2 sudo[199662]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:42.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:43 compute-2 sudo[199815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmqjtpsxytaakjzlphcvgwtrlyurddca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162863.1867785-4074-247724559126564/AnsiballZ_stat.py'
Jan 23 10:07:43 compute-2 sudo[199815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:43 compute-2 python3.9[199817]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:07:43 compute-2 sudo[199815]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:43 compute-2 ceph-mon[75771]: pgmap v425: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:07:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:43 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:43 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:44.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:44 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:44 compute-2 sudo[199971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thdcgpzzugtpycjhtzfnwsiodqgueguo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162864.2224102-4098-63856317834423/AnsiballZ_command.py'
Jan 23 10:07:44 compute-2 sudo[199971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:44 compute-2 python3.9[199973]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:44 compute-2 ceph-mon[75771]: pgmap v426: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:44 compute-2 sudo[199971]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:44.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:45 compute-2 sudo[200126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbfihlzocknpbtwupisayqamvnppnhgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162864.8778322-4122-125890235061473/AnsiballZ_file.py'
Jan 23 10:07:45 compute-2 sudo[200126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:45 compute-2 python3.9[200128]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:45 compute-2 sudo[200126]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:45 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:45 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:45 compute-2 sudo[200278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcisyjahlvkirkpdvzfguogwauemujac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162865.591543-4146-56111117212129/AnsiballZ_stat.py'
Jan 23 10:07:45 compute-2 sudo[200278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:46 compute-2 python3.9[200280]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:46 compute-2 sudo[200278]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:46.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:46 compute-2 sudo[200403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xytriltanwclpcwzrzlwprhbveegcvzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162865.591543-4146-56111117212129/AnsiballZ_copy.py'
Jan 23 10:07:46 compute-2 sudo[200403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:46 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:46 compute-2 python3.9[200405]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162865.591543-4146-56111117212129/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:46 compute-2 sudo[200403]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:46.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:47 compute-2 sudo[200555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-podzzewvzglewcdnmopgtylhwvaoymxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162866.7864878-4191-11101899059322/AnsiballZ_stat.py'
Jan 23 10:07:47 compute-2 sudo[200555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:47 compute-2 python3.9[200557]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:47 compute-2 sudo[200555]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:47 compute-2 ceph-mon[75771]: pgmap v427: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:47 compute-2 sudo[200678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqqzggwzjooooeecsgsqzwqlzbrxtvhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162866.7864878-4191-11101899059322/AnsiballZ_copy.py'
Jan 23 10:07:47 compute-2 sudo[200678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:47 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20009f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:47 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:47 compute-2 python3.9[200680]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162866.7864878-4191-11101899059322/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:47 compute-2 sudo[200678]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:48.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:48 compute-2 sudo[200832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqqsnsvaxedgwoehuzukxpnvxeyxjtpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162868.0675416-4236-208739899222965/AnsiballZ_stat.py'
Jan 23 10:07:48 compute-2 sudo[200832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:48 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:48 compute-2 python3.9[200834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:48 compute-2 sudo[200832]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:48 compute-2 sudo[200955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtinllmeglalcgjxkvuigfvrgascroxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162868.0675416-4236-208739899222965/AnsiballZ_copy.py'
Jan 23 10:07:48 compute-2 sudo[200955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:48.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:48 compute-2 python3.9[200957]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162868.0675416-4236-208739899222965/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:49 compute-2 sudo[200955]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:49 compute-2 ceph-mon[75771]: pgmap v428: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:49 compute-2 sudo[201107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rikpyosynufdsgxpxnlnbrqevqykiiee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162869.347002-4281-52370138072062/AnsiballZ_systemd.py'
Jan 23 10:07:49 compute-2 sudo[201107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:49 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:49 compute-2 kernel: ganesha.nfsd[194760]: segfault at 50 ip 00007f9aaabbe32e sp 00007f9a11ffa210 error 4 in libntirpc.so.5.8[7f9aaaba3000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 23 10:07:49 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:07:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:49 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc002b10 fd 38 proxy ignored for local
Jan 23 10:07:49 compute-2 systemd[1]: Started Process Core Dump (PID 201110/UID 0).
Jan 23 10:07:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:49 compute-2 python3.9[201109]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:07:49 compute-2 systemd[1]: Reloading.
Jan 23 10:07:50 compute-2 systemd-rc-local-generator[201139]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:50 compute-2 systemd-sysv-generator[201142]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:50.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:50 compute-2 systemd[1]: Reached target edpm_libvirt.target.
Jan 23 10:07:50 compute-2 sudo[201107]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:50 compute-2 sudo[201304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyqxsgbafuekbqhruvulyviuowjehvpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162870.6286197-4305-50351525110414/AnsiballZ_systemd.py'
Jan 23 10:07:50 compute-2 sudo[201304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:07:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:50.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:51 compute-2 systemd-coredump[201111]: Process 190355 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 57:
                                                    #0  0x00007f9aaabbe32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007f9aaabc8900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:07:51 compute-2 python3.9[201306]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 10:07:51 compute-2 systemd[1]: Reloading.
Jan 23 10:07:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:51 compute-2 systemd-rc-local-generator[201339]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:51 compute-2 systemd-sysv-generator[201343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:51 compute-2 podman[201309]: 2026-01-23 10:07:51.338951704 +0000 UTC m=+0.038551528 container died cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:07:51 compute-2 podman[201309]: 2026-01-23 10:07:51.376625159 +0000 UTC m=+0.076224953 container remove cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 23 10:07:51 compute-2 systemd[1]: var-lib-containers-storage-overlay-4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f-merged.mount: Deactivated successfully.
Jan 23 10:07:51 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:07:51 compute-2 systemd[1]: systemd-coredump@7-201110-0.service: Deactivated successfully.
Jan 23 10:07:51 compute-2 systemd[1]: systemd-coredump@7-201110-0.service: Consumed 1.457s CPU time.
Jan 23 10:07:51 compute-2 systemd[1]: Reloading.
Jan 23 10:07:51 compute-2 systemd-rc-local-generator[201416]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:51 compute-2 systemd-sysv-generator[201419]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:51 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:07:51 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.810s CPU time.
Jan 23 10:07:51 compute-2 sudo[201304]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:51 compute-2 ceph-mon[75771]: pgmap v429: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:52.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:52 compute-2 sshd-session[143197]: Connection closed by 192.168.122.30 port 34802
Jan 23 10:07:52 compute-2 sshd-session[143194]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:07:52 compute-2 systemd[1]: session-52.scope: Deactivated successfully.
Jan 23 10:07:52 compute-2 systemd[1]: session-52.scope: Consumed 3min 24.487s CPU time.
Jan 23 10:07:52 compute-2 systemd-logind[786]: Session 52 logged out. Waiting for processes to exit.
Jan 23 10:07:52 compute-2 systemd-logind[786]: Removed session 52.
Jan 23 10:07:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:52.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:52 compute-2 ceph-mon[75771]: pgmap v430: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:07:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:53 compute-2 podman[201451]: 2026-01-23 10:07:53.670683112 +0000 UTC m=+0.096709101 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 10:07:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:54.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:54.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:55 compute-2 ceph-mon[75771]: pgmap v431: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:07:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:07:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:07:55.470 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:07:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:07:55.470 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:07:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100755 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:07:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:56.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:56 compute-2 sudo[201481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:07:56 compute-2 sudo[201481]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:56 compute-2 sudo[201481]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:56 compute-2 sudo[201507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:07:56 compute-2 sudo[201507]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:56 compute-2 sudo[201507]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:56.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:57 compute-2 ceph-mon[75771]: pgmap v432: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:07:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:07:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:07:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:07:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:07:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:07:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:07:57 compute-2 podman[201566]: 2026-01-23 10:07:57.622616164 +0000 UTC m=+0.048764271 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 10:07:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:57 compute-2 sshd-session[201587]: Accepted publickey for zuul from 192.168.122.30 port 41990 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:07:57 compute-2 systemd-logind[786]: New session 53 of user zuul.
Jan 23 10:07:57 compute-2 systemd[1]: Started Session 53 of User zuul.
Jan 23 10:07:57 compute-2 sshd-session[201587]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:07:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:58.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:58 compute-2 python3.9[201742]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:07:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:07:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:07:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:58.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:07:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:07:59 compute-2 ceph-mon[75771]: pgmap v433: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:00.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:00 compute-2 sudo[201898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:08:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:00 compute-2 sudo[201898]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:08:00 compute-2 sudo[201898]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:00 compute-2 python3.9[201897]: ansible-ansible.builtin.service_facts Invoked
Jan 23 10:08:00 compute-2 network[201940]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 10:08:00 compute-2 network[201941]: 'network-scripts' will be removed from distribution in near future.
Jan 23 10:08:00 compute-2 network[201942]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 10:08:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:01.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:01 compute-2 ceph-mon[75771]: pgmap v434: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:01 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 8.
Jan 23 10:08:01 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:08:01 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.810s CPU time.
Jan 23 10:08:01 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:08:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:02.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:02 compute-2 podman[202049]: 2026-01-23 10:08:02.209384139 +0000 UTC m=+0.043938142 container create 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 10:08:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:08:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:08:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:08:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:02 compute-2 podman[202049]: 2026-01-23 10:08:02.270440543 +0000 UTC m=+0.104994536 container init 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:08:02 compute-2 podman[202049]: 2026-01-23 10:08:02.27673612 +0000 UTC m=+0.111290113 container start 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:08:02 compute-2 bash[202049]: 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab
Jan 23 10:08:02 compute-2 podman[202049]: 2026-01-23 10:08:02.191546926 +0000 UTC m=+0.026100939 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:08:02 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:08:02 compute-2 sudo[202110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:08:02 compute-2 sudo[202110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:08:02 compute-2 sudo[202110]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:03.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:03 compute-2 ceph-mon[75771]: pgmap v435: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:08:03 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:08:03 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:08:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:04.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:04 compute-2 sudo[202339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdqsdwjhxymvzuivieoluwmnjrhevaut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162884.6102393-98-138012193384330/AnsiballZ_setup.py'
Jan 23 10:08:04 compute-2 sudo[202339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:05.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:05 compute-2 ceph-mon[75771]: pgmap v436: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:08:05 compute-2 python3.9[202341]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 10:08:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:05 compute-2 sudo[202339]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:05 compute-2 sudo[202423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulhbfgyjklfdkqamqglenswaaadgnvqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162884.6102393-98-138012193384330/AnsiballZ_dnf.py'
Jan 23 10:08:05 compute-2 sudo[202423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:08:06 compute-2 python3.9[202425]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:08:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:06.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:07.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:07 compute-2 ceph-mon[75771]: pgmap v437: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:08:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:08.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:08:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:08:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:09.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:09 compute-2 ceph-mon[75771]: pgmap v438: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:08:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:08:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:10.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:08:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:10 compute-2 ceph-mon[75771]: pgmap v439: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:08:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:11.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:11 compute-2 sudo[202423]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:12.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:12 compute-2 sudo[202584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyzekdbzegoxzbfoutowxgwbxurtlmya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162892.1022525-135-67679269651391/AnsiballZ_stat.py'
Jan 23 10:08:12 compute-2 sudo[202584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:12 compute-2 python3.9[202586]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:08:12 compute-2 sudo[202584]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:13.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:13 compute-2 sudo[202736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvrahlopwcvvyismjteuktejwqduuchg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162892.980605-165-206397107359729/AnsiballZ_command.py'
Jan 23 10:08:13 compute-2 sudo[202736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:13 compute-2 python3.9[202738]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:13 compute-2 sudo[202736]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:13 compute-2 ceph-mon[75771]: pgmap v440: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:08:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:14.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:14 compute-2 sudo[202890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dogtlybnqqrrehpoahvsraykuubgkaxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162893.994417-195-21442373513029/AnsiballZ_stat.py'
Jan 23 10:08:14 compute-2 sudo[202890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:14 compute-2 python3.9[202892]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:08:14 compute-2 sudo[202890]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:14 compute-2 ceph-mon[75771]: pgmap v441: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:08:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:14 compute-2 sudo[203056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfkqzhgedwhqeaccwveopautdfyuqsnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162894.6598194-219-189520902116888/AnsiballZ_command.py'
Jan 23 10:08:14 compute-2 sudo[203056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:15.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:15 compute-2 python3.9[203058]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:15 compute-2 sudo[203056]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:15 compute-2 sudo[203209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfokwqnshrjelilwetqateucjawqhjnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162895.3848317-243-133858745391706/AnsiballZ_stat.py'
Jan 23 10:08:15 compute-2 sudo[203209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:15 compute-2 python3.9[203211]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:08:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:15 compute-2 sudo[203209]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:08:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:16.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:08:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:16 compute-2 sudo[203334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbmqjxnivvualjlxigaeisveyygiyyhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162895.3848317-243-133858745391706/AnsiballZ_copy.py'
Jan 23 10:08:16 compute-2 sudo[203334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:16 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:16 compute-2 python3.9[203337]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162895.3848317-243-133858745391706/.source.iscsi _original_basename=.1yc12p3o follow=False checksum=91c74ac05ca8208bb4cea7a74bd001dd4447ebae backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:16 compute-2 sudo[203334]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:17.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:17 compute-2 sudo[203487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kysbvrdhwrwoslwjsftgdiafvbncuekk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162896.7687626-288-186571126834354/AnsiballZ_file.py'
Jan 23 10:08:17 compute-2 sudo[203487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:17 compute-2 python3.9[203489]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:17 compute-2 sudo[203487]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:17 compute-2 ceph-mon[75771]: pgmap v442: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:08:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100817 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:08:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:17 compute-2 sudo[203640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptdkdiokxtrwaaomkrvwazjirphxhxjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162897.575438-312-210782664563292/AnsiballZ_lineinfile.py'
Jan 23 10:08:17 compute-2 sudo[203640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:18 compute-2 python3.9[203642]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:18 compute-2 sudo[203640]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:18.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:18 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:19.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:19 compute-2 sudo[203793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byyuqxzekfyoeimquzrngndrljrbnphu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162898.508151-338-278353377302463/AnsiballZ_systemd_service.py'
Jan 23 10:08:19 compute-2 sudo[203793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:19 compute-2 python3.9[203795]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:08:19 compute-2 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 23 10:08:19 compute-2 sudo[203793]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:19 compute-2 ceph-mon[75771]: pgmap v443: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:08:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:19 compute-2 sudo[203949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlkhcksyzdwunknhyzqvasmczdvkygfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162899.6669855-363-273809226612571/AnsiballZ_systemd_service.py'
Jan 23 10:08:19 compute-2 sudo[203949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:20.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:20 compute-2 python3.9[203951]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:08:20 compute-2 systemd[1]: Reloading.
Jan 23 10:08:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:20 compute-2 systemd-rc-local-generator[204004]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:20 compute-2 systemd-sysv-generator[204007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:20 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:08:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:20 compute-2 sudo[203957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:08:20 compute-2 sudo[203957]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:08:20 compute-2 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 10:08:20 compute-2 sudo[203957]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:20 compute-2 systemd[1]: Starting Open-iSCSI...
Jan 23 10:08:21 compute-2 kernel: Loading iSCSI transport class v2.0-870.
Jan 23 10:08:21 compute-2 systemd[1]: Started Open-iSCSI.
Jan 23 10:08:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:21.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:21 compute-2 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 23 10:08:21 compute-2 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 23 10:08:21 compute-2 sudo[203949]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700021c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:22 compute-2 ceph-mon[75771]: pgmap v444: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:08:22 compute-2 python3.9[204177]: ansible-ansible.builtin.service_facts Invoked
Jan 23 10:08:22 compute-2 network[204194]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 10:08:22 compute-2 network[204195]: 'network-scripts' will be removed from distribution in near future.
Jan 23 10:08:22 compute-2 network[204196]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 10:08:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:22.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:22 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:23.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:23 compute-2 ceph-mon[75771]: pgmap v445: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:08:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700021c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:23 compute-2 podman[204247]: 2026-01-23 10:08:23.814531734 +0000 UTC m=+0.096371625 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 10:08:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:24.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:24 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:25.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:25 compute-2 ceph-mon[75771]: pgmap v446: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:08:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:26.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:26 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700021c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:26 compute-2 sudo[204497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nszrznohabqcwaccruaetkrcmouftvkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162906.3856816-432-117066881282221/AnsiballZ_dnf.py'
Jan 23 10:08:26 compute-2 sudo[204497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:26 compute-2 python3.9[204499]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:08:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:27.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:27 compute-2 ceph-mon[75771]: pgmap v447: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:08:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:28.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:28 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:28 compute-2 podman[204507]: 2026-01-23 10:08:28.637567109 +0000 UTC m=+0.055430471 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 10:08:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:29.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:29 compute-2 ceph-mon[75771]: pgmap v448: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:30 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 10:08:30 compute-2 systemd[1]: Starting man-db-cache-update.service...
Jan 23 10:08:30 compute-2 systemd[1]: Reloading.
Jan 23 10:08:30 compute-2 systemd-rc-local-generator[204570]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:30 compute-2 systemd-sysv-generator[204573]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:30.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:30 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 10:08:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:30 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:30 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 10:08:30 compute-2 systemd[1]: Finished man-db-cache-update.service.
Jan 23 10:08:30 compute-2 systemd[1]: run-rdaddf7c78ec94e55a5c922f3bcca0ef9.service: Deactivated successfully.
Jan 23 10:08:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:30 compute-2 sudo[204497]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:31.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:31 compute-2 sudo[204838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uylewrzmgeiufvgqliytyywbtldpfmcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162911.106134-459-207805014064960/AnsiballZ_file.py'
Jan 23 10:08:31 compute-2 sudo[204838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:31 compute-2 python3.9[204840]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 10:08:31 compute-2 sudo[204838]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:31 compute-2 ceph-mon[75771]: pgmap v449: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:32 compute-2 sudo[204991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxkmmmqehgkduzrjbdhhocwsiajzaonf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162911.793812-482-70122758240468/AnsiballZ_modprobe.py'
Jan 23 10:08:32 compute-2 sudo[204991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:32.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:32 compute-2 python3.9[204993]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 23 10:08:32 compute-2 sudo[204991]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:32 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:32 compute-2 ceph-mon[75771]: pgmap v450: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:32 compute-2 sudo[205148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnapyanovbmiwnrdkyulpvvhqougacqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162912.6237454-507-39111210236565/AnsiballZ_stat.py'
Jan 23 10:08:32 compute-2 sudo[205148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:33 compute-2 python3.9[205150]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:08:33 compute-2 sudo[205148]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:33.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:33 compute-2 sudo[205271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stnfciulxlbqxieovrjbtgfszpexlljz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162912.6237454-507-39111210236565/AnsiballZ_copy.py'
Jan 23 10:08:33 compute-2 sudo[205271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:33 compute-2 python3.9[205273]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162912.6237454-507-39111210236565/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:33 compute-2 sudo[205271]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:33 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:33 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:34 compute-2 sudo[205424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrnepjdjhvlgnhtmqcqdixvbcxxawozu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162913.8877535-555-122642267153238/AnsiballZ_lineinfile.py'
Jan 23 10:08:34 compute-2 sudo[205424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:08:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:34.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:08:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:34 compute-2 python3.9[205426]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:34 compute-2 sudo[205424]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:34 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:35.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:35 compute-2 sudo[205577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvrfballvlrgrmwlthmdexzfvpzajetb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162914.5965471-579-94630723934985/AnsiballZ_systemd.py'
Jan 23 10:08:35 compute-2 sudo[205577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:35 compute-2 ceph-mon[75771]: pgmap v451: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:08:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:35 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:35 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:35 compute-2 python3.9[205579]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:08:35 compute-2 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 10:08:35 compute-2 systemd[1]: Stopped Load Kernel Modules.
Jan 23 10:08:35 compute-2 systemd[1]: Stopping Load Kernel Modules...
Jan 23 10:08:35 compute-2 systemd[1]: Starting Load Kernel Modules...
Jan 23 10:08:35 compute-2 systemd[1]: Finished Load Kernel Modules.
Jan 23 10:08:35 compute-2 sudo[205577]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:36.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:36 compute-2 sudo[205735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsbhhvdmisyzfntopunouhfeewifebxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162916.078632-603-208954784275622/AnsiballZ_command.py'
Jan 23 10:08:36 compute-2 sudo[205735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:36 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:36 compute-2 python3.9[205737]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:36 compute-2 sudo[205735]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:37.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:37 compute-2 sudo[205888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sibndhtpdleuuopoxdzxmkjtfktvsjvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162916.9501731-633-112687918377014/AnsiballZ_stat.py'
Jan 23 10:08:37 compute-2 sudo[205888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:37 compute-2 python3.9[205890]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:08:37 compute-2 sudo[205888]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:37 compute-2 ceph-mon[75771]: pgmap v452: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:08:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:37 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:37 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:37 compute-2 sudo[206041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kawtkjliunnjeealqquqszulxktriaza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162917.7339756-660-73664365490608/AnsiballZ_stat.py'
Jan 23 10:08:37 compute-2 sudo[206041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:38 compute-2 python3.9[206043]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:08:38 compute-2 sudo[206041]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:38.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:38 compute-2 sudo[206165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqcxqmbchlgiaqiqflpxiroesbuhtola ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162917.7339756-660-73664365490608/AnsiballZ_copy.py'
Jan 23 10:08:38 compute-2 sudo[206165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:38 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:38 compute-2 python3.9[206167]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162917.7339756-660-73664365490608/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:38 compute-2 sudo[206165]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:38 compute-2 ceph-mon[75771]: pgmap v453: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:39.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:39 compute-2 sudo[206317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqdrwwpgzgzsduwqezrmecyxulcyykjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162919.06429-705-69327069959237/AnsiballZ_command.py'
Jan 23 10:08:39 compute-2 sudo[206317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:39 compute-2 python3.9[206319]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:39 compute-2 sudo[206317]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:39 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:39 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f800095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:40 compute-2 sudo[206471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nthdvsilxaoffvsrfntrxkyuqexrvwde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162919.743586-729-251307523704250/AnsiballZ_lineinfile.py'
Jan 23 10:08:40 compute-2 sudo[206471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:40 compute-2 python3.9[206473]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:40.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:40 compute-2 sudo[206471]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:40 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f800095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:40 compute-2 sudo[206624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjgragvhgxzaxeadamhcbscmluayqjan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162920.441481-753-183232159529159/AnsiballZ_replace.py'
Jan 23 10:08:40 compute-2 sudo[206624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:41 compute-2 sudo[206627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:08:41 compute-2 sudo[206627]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:08:41 compute-2 sudo[206627]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:41.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:41 compute-2 python3.9[206626]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:41 compute-2 sudo[206624]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:41 compute-2 sudo[206801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpbtiwdfnszdmylueoevmmsotimeebxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162921.3172765-777-199846549266160/AnsiballZ_replace.py'
Jan 23 10:08:41 compute-2 sudo[206801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:41 compute-2 ceph-mon[75771]: pgmap v454: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:41 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:41 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:41 compute-2 python3.9[206803]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:41 compute-2 sudo[206801]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:42.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:42 compute-2 sudo[206955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tceqsmhnrygpumjarownowwfgjrtkfcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162922.1493685-804-18020165734679/AnsiballZ_lineinfile.py'
Jan 23 10:08:42 compute-2 sudo[206955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:42 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:42 compute-2 python3.9[206957]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:42 compute-2 sudo[206955]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:42 compute-2 sudo[207107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkdfukmzivbponvxntgibflkxtgedinf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162922.735581-804-102538609422747/AnsiballZ_lineinfile.py'
Jan 23 10:08:43 compute-2 sudo[207107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:43.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:43 compute-2 python3.9[207109]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:43 compute-2 sudo[207107]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:43 compute-2 ceph-mon[75771]: pgmap v455: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:43 compute-2 sudo[207259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdsjoxrwkxtkhyubqeuusrkpzjadvykt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162923.3086226-804-182072693069846/AnsiballZ_lineinfile.py'
Jan 23 10:08:43 compute-2 sudo[207259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:43 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f800095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:43 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:43 compute-2 python3.9[207261]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:43 compute-2 sudo[207259]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:44 compute-2 sudo[207412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzdazhfgemlphjevrwsolkyraygehlsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162923.8529167-804-9782558953355/AnsiballZ_lineinfile.py'
Jan 23 10:08:44 compute-2 sudo[207412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:44.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:44 compute-2 python3.9[207414]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:44 compute-2 sudo[207412]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:44 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:45.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:45 compute-2 sudo[207565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmcycscvfsqydcuscsngjpyeabpwsdeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162924.8125353-891-12077034556095/AnsiballZ_stat.py'
Jan 23 10:08:45 compute-2 sudo[207565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:45 compute-2 python3.9[207567]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:08:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:45 compute-2 sudo[207565]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:45 compute-2 ceph-mon[75771]: pgmap v456: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:45 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:45 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f800095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:45 compute-2 sudo[207719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgitiuyzoupvcfhuzfxghuabwykasbsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162925.4975603-915-19229929164266/AnsiballZ_command.py'
Jan 23 10:08:45 compute-2 sudo[207719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:45 compute-2 python3.9[207721]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:45 compute-2 sudo[207719]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:46.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:46 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:46 compute-2 sudo[207874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqhcjufdxxdhbwklctprbgwhxmqnndwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162926.2811577-942-15024490914608/AnsiballZ_systemd_service.py'
Jan 23 10:08:46 compute-2 sudo[207874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:46 compute-2 python3.9[207876]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:08:46 compute-2 systemd[1]: Listening on multipathd control socket.
Jan 23 10:08:46 compute-2 sudo[207874]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:47.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:47 compute-2 sudo[208030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rivkvlwpvjxqqcvehlrrndueqpteiqty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162927.1888406-966-104262967179343/AnsiballZ_systemd_service.py'
Jan 23 10:08:47 compute-2 sudo[208030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:47 compute-2 ceph-mon[75771]: pgmap v457: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:08:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:47 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:47 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:47 compute-2 python3.9[208032]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:08:47 compute-2 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 23 10:08:47 compute-2 udevadm[208037]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 23 10:08:47 compute-2 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 23 10:08:47 compute-2 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 10:08:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:47 compute-2 multipathd[208040]: --------start up--------
Jan 23 10:08:47 compute-2 multipathd[208040]: read /etc/multipath.conf
Jan 23 10:08:47 compute-2 multipathd[208040]: path checkers start up
Jan 23 10:08:47 compute-2 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 10:08:47 compute-2 sudo[208030]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:48.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:48 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f8000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:48 compute-2 ceph-mon[75771]: pgmap v458: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:48 compute-2 sudo[208199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxieiwhhjzruhzxgjpizvkbytmokwxve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162928.5102892-1002-79420630704924/AnsiballZ_file.py'
Jan 23 10:08:48 compute-2 sudo[208199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:48 compute-2 python3.9[208201]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 10:08:48 compute-2 sudo[208199]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:49.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:49 compute-2 sudo[208351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fneuqvckpmmlbzxvxtkbvceeuxboarri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162929.1934009-1026-274973035846182/AnsiballZ_modprobe.py'
Jan 23 10:08:49 compute-2 sudo[208351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:49 compute-2 python3.9[208353]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 23 10:08:49 compute-2 kernel: Key type psk registered
Jan 23 10:08:49 compute-2 sudo[208351]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:49 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:49 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:50.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:50 compute-2 sudo[208513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkmnkzxaqdqzbcsvhneowuaoljfpbacg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162929.9599311-1050-89432496677408/AnsiballZ_stat.py'
Jan 23 10:08:50 compute-2 sudo[208513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:08:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:50 compute-2 python3.9[208515]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:08:50 compute-2 sudo[208513]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:50 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:50 compute-2 sudo[208637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwzfopuulvzhguzhonuooazjktmnmfpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162929.9599311-1050-89432496677408/AnsiballZ_copy.py'
Jan 23 10:08:50 compute-2 sudo[208637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:50 compute-2 python3.9[208639]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162929.9599311-1050-89432496677408/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:50 compute-2 sudo[208637]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:51.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:51 compute-2 ceph-mon[75771]: pgmap v459: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:51 compute-2 sudo[208789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipctmfxegbbjochvmefrhhhbpowzuxju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162931.2623491-1098-197595142735875/AnsiballZ_lineinfile.py'
Jan 23 10:08:51 compute-2 sudo[208789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:51 compute-2 python3.9[208791]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:51 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f8000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:51 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:51 compute-2 sudo[208789]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:52 compute-2 sudo[208942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weynglxdexcsskhmejcckkzbldwanzia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162931.9613948-1122-209910228381123/AnsiballZ_systemd.py'
Jan 23 10:08:52 compute-2 sudo[208942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:52.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:52 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:52 compute-2 python3.9[208944]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:08:52 compute-2 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 10:08:52 compute-2 systemd[1]: Stopped Load Kernel Modules.
Jan 23 10:08:52 compute-2 systemd[1]: Stopping Load Kernel Modules...
Jan 23 10:08:52 compute-2 systemd[1]: Starting Load Kernel Modules...
Jan 23 10:08:52 compute-2 systemd[1]: Finished Load Kernel Modules.
Jan 23 10:08:52 compute-2 sudo[208942]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:53.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:53 compute-2 sudo[209099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxzzqzquqrmzzumslsfpmkxjdlagjvem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162932.8952734-1146-202734293559359/AnsiballZ_dnf.py'
Jan 23 10:08:53 compute-2 sudo[209099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:53 compute-2 python3.9[209101]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:08:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:53 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:53 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f8000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:53 compute-2 ceph-mon[75771]: pgmap v460: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:54.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:54 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:54 compute-2 podman[209105]: 2026-01-23 10:08:54.682876386 +0000 UTC m=+0.105889215 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 23 10:08:54 compute-2 ceph-mon[75771]: pgmap v461: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:55.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:08:55.471 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:08:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:08:55.472 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:08:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:08:55.472 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:08:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:55 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:55 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:55 compute-2 systemd[1]: Reloading.
Jan 23 10:08:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:55 compute-2 systemd-rc-local-generator[209161]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:55 compute-2 systemd-sysv-generator[209165]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:56 compute-2 systemd[1]: Reloading.
Jan 23 10:08:56 compute-2 systemd-sysv-generator[209201]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:56 compute-2 systemd-rc-local-generator[209197]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:08:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:56.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:08:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:56 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f8000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:56 compute-2 systemd-logind[786]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 10:08:56 compute-2 systemd-logind[786]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 10:08:56 compute-2 lvm[209246]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 10:08:56 compute-2 lvm[209246]: VG ceph_vg0 finished
Jan 23 10:08:56 compute-2 ceph-mon[75771]: pgmap v462: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:08:56 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 10:08:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:56 compute-2 systemd[1]: Starting man-db-cache-update.service...
Jan 23 10:08:56 compute-2 systemd[1]: Reloading.
Jan 23 10:08:56 compute-2 systemd-sysv-generator[209301]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:56 compute-2 systemd-rc-local-generator[209298]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:57.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:57 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 10:08:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:57 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:57 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:57 compute-2 sudo[209099]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:58.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:58 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:58 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 10:08:58 compute-2 systemd[1]: Finished man-db-cache-update.service.
Jan 23 10:08:58 compute-2 systemd[1]: man-db-cache-update.service: Consumed 1.461s CPU time.
Jan 23 10:08:58 compute-2 systemd[1]: run-r372e5d1c1d7046f8b4804c42f3fe1e95.service: Deactivated successfully.
Jan 23 10:08:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:08:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:59.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:08:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100859 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:08:59 compute-2 podman[210476]: 2026-01-23 10:08:59.649676043 +0000 UTC m=+0.078147579 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:08:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:59 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:59 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:59 compute-2 ceph-mon[75771]: pgmap v463: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.003000073s ======
Jan 23 10:09:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:00.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000073s
Jan 23 10:09:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:00 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:01.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:01 compute-2 sudo[210598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:09:01 compute-2 sudo[210598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:01 compute-2 sudo[210598]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:01 compute-2 sudo[210650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzyojjrlxrgsimfvtashperrcovfksrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162940.8816125-1170-226127102192843/AnsiballZ_systemd_service.py'
Jan 23 10:09:01 compute-2 sudo[210650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:01 compute-2 python3.9[210652]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:09:01 compute-2 iscsid[204018]: iscsid shutting down.
Jan 23 10:09:01 compute-2 systemd[1]: Stopping Open-iSCSI...
Jan 23 10:09:01 compute-2 systemd[1]: iscsid.service: Deactivated successfully.
Jan 23 10:09:01 compute-2 systemd[1]: Stopped Open-iSCSI.
Jan 23 10:09:01 compute-2 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 10:09:01 compute-2 systemd[1]: Starting Open-iSCSI...
Jan 23 10:09:01 compute-2 systemd[1]: Started Open-iSCSI.
Jan 23 10:09:01 compute-2 sudo[210650]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:01 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:01 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:02 compute-2 ceph-mon[75771]: pgmap v464: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:02.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:02 compute-2 sudo[210683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:09:02 compute-2 sudo[210683]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:02 compute-2 sudo[210683]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:02 compute-2 sudo[210721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:09:02 compute-2 sudo[210721]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:03.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:03 compute-2 ceph-mon[75771]: pgmap v465: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:03 compute-2 sudo[210877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mixjvafrpsdhpsfxdtmvarfbyykdpvzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162942.8135622-1194-123467942438545/AnsiballZ_systemd_service.py'
Jan 23 10:09:03 compute-2 sudo[210877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:03 compute-2 sudo[210721]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:03 compute-2 python3.9[210879]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:09:03 compute-2 multipathd[208040]: exit (signal)
Jan 23 10:09:03 compute-2 multipathd[208040]: --------shut down-------
Jan 23 10:09:03 compute-2 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 23 10:09:03 compute-2 systemd[1]: multipathd.service: Deactivated successfully.
Jan 23 10:09:03 compute-2 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 23 10:09:03 compute-2 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 10:09:03 compute-2 multipathd[210897]: --------start up--------
Jan 23 10:09:03 compute-2 multipathd[210897]: read /etc/multipath.conf
Jan 23 10:09:03 compute-2 multipathd[210897]: path checkers start up
Jan 23 10:09:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:03 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:03 compute-2 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 10:09:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:03 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:03 compute-2 sudo[210877]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:04.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:04 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:04 compute-2 python3.9[211056]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:09:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:05.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:05 compute-2 ceph-mon[75771]: pgmap v466: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:09:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:09:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:09:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:09:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:09:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:09:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:05 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:05 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.927611) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945927859, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1320, "num_deletes": 260, "total_data_size": 3219343, "memory_usage": 3265056, "flush_reason": "Manual Compaction"}
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945942148, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2098879, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18059, "largest_seqno": 19374, "table_properties": {"data_size": 2093326, "index_size": 2947, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11386, "raw_average_key_size": 18, "raw_value_size": 2082056, "raw_average_value_size": 3402, "num_data_blocks": 132, "num_entries": 612, "num_filter_entries": 612, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162834, "oldest_key_time": 1769162834, "file_creation_time": 1769162945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 14578 microseconds, and 6138 cpu microseconds.
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.942251) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2098879 bytes OK
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.942286) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.944061) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.944089) EVENT_LOG_v1 {"time_micros": 1769162945944084, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.944113) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3213119, prev total WAL file size 3213119, number of live WAL files 2.
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.945372) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323536' seq:0, type:0; will stop at (end)
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2049KB)], [33(11MB)]
Jan 23 10:09:05 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945945577, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 13884527, "oldest_snapshot_seqno": -1}
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4954 keys, 13427325 bytes, temperature: kUnknown
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946033892, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13427325, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13392414, "index_size": 21425, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12421, "raw_key_size": 126088, "raw_average_key_size": 25, "raw_value_size": 13300515, "raw_average_value_size": 2684, "num_data_blocks": 881, "num_entries": 4954, "num_filter_entries": 4954, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.034127) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13427325 bytes
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.035788) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 157.1 rd, 151.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.2 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(13.0) write-amplify(6.4) OK, records in: 5488, records dropped: 534 output_compression: NoCompression
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.035809) EVENT_LOG_v1 {"time_micros": 1769162946035796, "job": 18, "event": "compaction_finished", "compaction_time_micros": 88379, "compaction_time_cpu_micros": 29172, "output_level": 6, "num_output_files": 1, "total_output_size": 13427325, "num_input_records": 5488, "num_output_records": 4954, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946036662, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946038957, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.945121) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039119) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039133) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:06.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:06 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:06 compute-2 sudo[211212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvgvyfyomfawnwgfloqmjpatgliizknj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162945.9233012-1246-122063536865147/AnsiballZ_file.py'
Jan 23 10:09:06 compute-2 sudo[211212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:06 compute-2 python3.9[211214]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:06 compute-2 sudo[211212]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:06 compute-2 ceph-mon[75771]: pgmap v467: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 853 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:09:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:07.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:07 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:07 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:07 compute-2 sudo[211364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnlfwueojhokdescysvrvwujplrryvwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162947.6776915-1279-25605655540932/AnsiballZ_systemd_service.py'
Jan 23 10:09:07 compute-2 sudo[211364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:08 compute-2 python3.9[211367]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:09:08 compute-2 systemd[1]: Reloading.
Jan 23 10:09:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:08.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:08 compute-2 systemd-rc-local-generator[211396]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:09:08 compute-2 systemd-sysv-generator[211399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:09:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:08 compute-2 sudo[211364]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:09:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:09.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:09 compute-2 python3.9[211554]: ansible-ansible.builtin.service_facts Invoked
Jan 23 10:09:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:09 compute-2 network[211571]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 10:09:09 compute-2 network[211572]: 'network-scripts' will be removed from distribution in near future.
Jan 23 10:09:09 compute-2 network[211573]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 10:09:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:09 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:09 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70001ab0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:09 compute-2 ceph-mon[75771]: pgmap v468: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:09:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:10.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:10 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:09:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:11.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:09:11 compute-2 sudo[211647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:09:11 compute-2 sudo[211647]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:11 compute-2 sudo[211647]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:11 compute-2 ceph-mon[75771]: pgmap v469: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:09:11 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:11 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:09:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:09:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:12.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:12 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:13.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:13 compute-2 ceph-mon[75771]: pgmap v470: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:09:13 compute-2 sudo[211873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaevmvqjaaodnakvnocrwqjckdocawsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162953.4423828-1336-130052259418678/AnsiballZ_systemd_service.py'
Jan 23 10:09:13 compute-2 sudo[211873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:13 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:13 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:13 compute-2 python3.9[211875]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:14 compute-2 sudo[211873]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:14.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:14 compute-2 sudo[212028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxeztfbndnvlriyikzyxvinstbyhiutu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162954.1320436-1336-66444162321970/AnsiballZ_systemd_service.py'
Jan 23 10:09:14 compute-2 sudo[212028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:14 compute-2 python3.9[212030]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:14 compute-2 sudo[212028]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:09:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:15 compute-2 sudo[212181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mduzcxgxjlkpbmtqufdcqofmtdhcygdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162954.8442369-1336-110533937170707/AnsiballZ_systemd_service.py'
Jan 23 10:09:15 compute-2 sudo[212181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:15.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:15 compute-2 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 23 10:09:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:15 compute-2 python3.9[212183]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:15 compute-2 sudo[212181]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:15 compute-2 ceph-mon[75771]: pgmap v471: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:09:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:15 compute-2 sudo[212335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlnkkuaraskqskprwcmzmuyxxnahmkaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162955.5788846-1336-19311929245622/AnsiballZ_systemd_service.py'
Jan 23 10:09:15 compute-2 sudo[212335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:16 compute-2 python3.9[212337]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:16 compute-2 sudo[212335]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:16.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:16 compute-2 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 10:09:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:16 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:16 compute-2 sudo[212491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfvyfazwvxwbblwxqolwmwmxjqnrijqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162956.3743188-1336-17038641745046/AnsiballZ_systemd_service.py'
Jan 23 10:09:16 compute-2 sudo[212491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:16 compute-2 python3.9[212493]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:16 compute-2 ceph-mon[75771]: pgmap v472: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:09:17 compute-2 sudo[212491]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:17.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:17 compute-2 sudo[212644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aixmwumyufxccsdxdlqgaprljykwjkqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162957.1236053-1336-262863908578861/AnsiballZ_systemd_service.py'
Jan 23 10:09:17 compute-2 sudo[212644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:17 compute-2 python3.9[212646]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:17 compute-2 sudo[212644]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:18 compute-2 sudo[212798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmdllvuyjzcattzqlmawtcjumowppqpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162957.868257-1336-246877566487424/AnsiballZ_systemd_service.py'
Jan 23 10:09:18 compute-2 sudo[212798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:18.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:18 compute-2 python3.9[212800]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:18 compute-2 sudo[212798]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:18 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:18 compute-2 sudo[212952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyofmbngzwmpdziufqtmkvozehhokign ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162958.5365877-1336-243830551309066/AnsiballZ_systemd_service.py'
Jan 23 10:09:18 compute-2 sudo[212952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:19 compute-2 python3.9[212954]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:19.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:19 compute-2 sudo[212952]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:19 compute-2 ceph-mon[75771]: pgmap v473: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:09:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:20.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:20 compute-2 sudo[213106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nztloknaiickcupfbdbiscigerpmrvyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162960.0140605-1512-244665057060887/AnsiballZ_file.py'
Jan 23 10:09:20 compute-2 sudo[213106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:20 compute-2 python3.9[213108]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:20 compute-2 sudo[213106]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:20 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:09:20 compute-2 sudo[213259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knblfsnnlecdqxobcrzaptjudosxapnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162960.5835302-1512-169269537039648/AnsiballZ_file.py'
Jan 23 10:09:20 compute-2 sudo[213259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:21 compute-2 python3.9[213261]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:21 compute-2 sudo[213259]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:09:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:21.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:09:21 compute-2 sudo[213291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:09:21 compute-2 sudo[213291]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:21 compute-2 sudo[213291]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100921 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:09:21 compute-2 sudo[213436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cngyllwecynuldyzdcasqpxkttxhpivf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162961.170153-1512-13703581336995/AnsiballZ_file.py'
Jan 23 10:09:21 compute-2 sudo[213436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:21 compute-2 python3.9[213438]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:21 compute-2 sudo[213436]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:21 compute-2 ceph-mon[75771]: pgmap v474: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:09:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:21 compute-2 sudo[213589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsxachlwzarcbnnmfidjqcfzhwrkomgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162961.7474408-1512-114272836082578/AnsiballZ_file.py'
Jan 23 10:09:21 compute-2 sudo[213589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:22 compute-2 python3.9[213591]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:22 compute-2 sudo[213589]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:22.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:22 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:22 compute-2 sudo[213742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkodsihjznbahgvdnaadtvowvdqvhflf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162962.3410296-1512-171121032390672/AnsiballZ_file.py'
Jan 23 10:09:22 compute-2 sudo[213742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:22 compute-2 python3.9[213744]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:22 compute-2 sudo[213742]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:22 compute-2 ceph-mon[75771]: pgmap v475: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:09:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:23.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:23 compute-2 sudo[213894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpkvyawamdxsnmjmzyvhkmgwmtbhvwwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162962.9513288-1512-62460844801630/AnsiballZ_file.py'
Jan 23 10:09:23 compute-2 sudo[213894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:23 compute-2 python3.9[213896]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:23 compute-2 sudo[213894]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:23 compute-2 sudo[214046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktlfczxrqlialgeercsamynrdhibretb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162963.5942934-1512-232837592975696/AnsiballZ_file.py'
Jan 23 10:09:23 compute-2 sudo[214046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:24 compute-2 python3.9[214048]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:24 compute-2 sudo[214046]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:24.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:24 compute-2 sudo[214200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enqgljmbgtkmlqrhsfyykefzxldnkuys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162964.153283-1512-156455144603129/AnsiballZ_file.py'
Jan 23 10:09:24 compute-2 sudo[214200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:24 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:24 compute-2 python3.9[214202]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:24 compute-2 sudo[214200]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:25.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:25 compute-2 sudo[214366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfksycnzhcwpcvbxdspqaqbhpoastloz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162965.0200086-1684-261862288814074/AnsiballZ_file.py'
Jan 23 10:09:25 compute-2 sudo[214366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:25 compute-2 podman[214326]: 2026-01-23 10:09:25.388755943 +0000 UTC m=+0.100376217 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 10:09:25 compute-2 python3.9[214371]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:25 compute-2 sudo[214366]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:25 compute-2 ceph-mon[75771]: pgmap v476: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:09:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:25 compute-2 sudo[214531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbgpbfftqyqutykufrwothezhfpcdfhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162965.696225-1684-26783445286336/AnsiballZ_file.py'
Jan 23 10:09:25 compute-2 sudo[214531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:26 compute-2 python3.9[214533]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:26 compute-2 sudo[214531]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:26.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:26 compute-2 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 10:09:26 compute-2 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 23 10:09:26 compute-2 sudo[214687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmdtogpmmyoxgibwbhjlceaptqlywwfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162966.2121146-1684-120795265462226/AnsiballZ_file.py'
Jan 23 10:09:26 compute-2 sudo[214687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:26 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:26 compute-2 python3.9[214689]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:26 compute-2 sudo[214687]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:26 compute-2 sudo[214839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfqrfasmtqielcsqhxlurtnckwrikkek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162966.7249818-1684-243148315637452/AnsiballZ_file.py'
Jan 23 10:09:26 compute-2 sudo[214839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:27.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:27 compute-2 python3.9[214841]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:27 compute-2 sudo[214839]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:27 compute-2 sudo[214991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcoxnabhybomztbqlobpivrievbvqkhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162967.2899568-1684-133813585306339/AnsiballZ_file.py'
Jan 23 10:09:27 compute-2 sudo[214991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:27 compute-2 ceph-mon[75771]: pgmap v477: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:09:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:27 compute-2 python3.9[214993]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:27 compute-2 sudo[214991]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:28 compute-2 sudo[215144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmqkjmdffnpglzpfkufjuuhixmqydiqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162967.8916461-1684-159021778316509/AnsiballZ_file.py'
Jan 23 10:09:28 compute-2 sudo[215144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:28.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:28 compute-2 python3.9[215146]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:28 compute-2 sudo[215144]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:28 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:28 compute-2 sudo[215297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeyuzaxslpdlhepiegjtntwfavfykbre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162968.4400284-1684-92650121819248/AnsiballZ_file.py'
Jan 23 10:09:28 compute-2 sudo[215297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:28 compute-2 python3.9[215299]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:28 compute-2 sudo[215297]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:29 compute-2 ceph-mon[75771]: pgmap v478: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:09:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:29.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:29 compute-2 sudo[215449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvacobdzrqxnwlprmiywwzebkjetvhvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162968.9878595-1684-122259844626506/AnsiballZ_file.py'
Jan 23 10:09:29 compute-2 sudo[215449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:29 compute-2 podman[215451]: 2026-01-23 10:09:29.738553103 +0000 UTC m=+0.048342662 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 10:09:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:29 compute-2 python3.9[215452]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:29 compute-2 sudo[215449]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:30.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:30 compute-2 sudo[215622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcdrkbfcwwbeumrspxhgosykeozwrkjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162970.2262123-1858-137672461857746/AnsiballZ_command.py'
Jan 23 10:09:30 compute-2 sudo[215622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:30 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:30 compute-2 python3.9[215624]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:30 compute-2 sudo[215622]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:31.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:31 compute-2 ceph-mon[75771]: pgmap v479: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:09:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:31 compute-2 python3.9[215776]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 10:09:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:32 compute-2 sudo[215929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfvldzfpaaxxztufihzfkypcnmxgxyfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162971.8526623-1912-155162102037522/AnsiballZ_systemd_service.py'
Jan 23 10:09:32 compute-2 sudo[215929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:32.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:32 compute-2 python3.9[215931]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:09:32 compute-2 systemd[1]: Reloading.
Jan 23 10:09:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:32 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:32 compute-2 systemd-sysv-generator[215961]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:09:32 compute-2 systemd-rc-local-generator[215958]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:09:32 compute-2 sudo[215929]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:33.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:33 compute-2 sudo[216117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcsunqcgtgwvieyrvawthkefxsewaajf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162973.0069556-1936-109218922073456/AnsiballZ_command.py'
Jan 23 10:09:33 compute-2 sudo[216117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:33 compute-2 ceph-mon[75771]: pgmap v480: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:09:33 compute-2 python3.9[216119]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:33 compute-2 sudo[216117]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:33 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:33 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:33 compute-2 sudo[216270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifoqylunxijzjqwkukxybfjcxhulnbey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162973.5978196-1936-95699553042481/AnsiballZ_command.py'
Jan 23 10:09:33 compute-2 sudo[216270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:34 compute-2 python3.9[216272]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:34 compute-2 sudo[216270]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:34.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:34 compute-2 sudo[216425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jonrjdgijvozkkzapqbmjmcobmgnqyan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162974.189044-1936-234238062955316/AnsiballZ_command.py'
Jan 23 10:09:34 compute-2 sudo[216425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:34 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:34 compute-2 python3.9[216427]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:34 compute-2 sudo[216425]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:35 compute-2 sudo[216578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbirfqialudodmyuluyxgqmqltljlnwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162974.7919629-1936-256443943315888/AnsiballZ_command.py'
Jan 23 10:09:35 compute-2 sudo[216578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:35.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:35 compute-2 python3.9[216580]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:35 compute-2 sudo[216578]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:35 compute-2 ceph-mon[75771]: pgmap v481: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:09:35 compute-2 sudo[216731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqeinkxiyclbdohmcbfkscqrswrofhgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162975.3933954-1936-70754891852323/AnsiballZ_command.py'
Jan 23 10:09:35 compute-2 sudo[216731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:35 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:35 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:35 compute-2 python3.9[216733]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:35 compute-2 sudo[216731]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:36.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:36 compute-2 sudo[216886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lormqxrlbffznsnufzvykcmncqtpklod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162976.1030428-1936-178591884272172/AnsiballZ_command.py'
Jan 23 10:09:36 compute-2 sudo[216886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:36 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:36 compute-2 python3.9[216888]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:36 compute-2 sudo[216886]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:37 compute-2 sudo[217039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqptkhjmbszlwblntqugbximeiiuynpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162976.8108509-1936-1954989901676/AnsiballZ_command.py'
Jan 23 10:09:37 compute-2 sudo[217039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:09:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:37.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:09:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:37 compute-2 python3.9[217041]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:37 compute-2 ceph-mon[75771]: pgmap v482: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:09:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:37 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:37 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:38.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:38 compute-2 sudo[217039]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:38 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:38 compute-2 sudo[217194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acezxoywrzsjitprobifeqdrntdcixbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162978.4614809-1936-50678349020526/AnsiballZ_command.py'
Jan 23 10:09:38 compute-2 sudo[217194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:38 compute-2 ceph-mon[75771]: pgmap v483: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:38 compute-2 python3.9[217196]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:39.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:39 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:39 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:40 compute-2 sudo[217194]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:40.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:40 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:41.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:41 compute-2 sudo[217249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:09:41 compute-2 sudo[217249]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:41 compute-2 sudo[217249]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:41 compute-2 sudo[217374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbelzbznrubfqxqfxllajtbbpxgvffgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162981.1850793-2143-79848458657387/AnsiballZ_file.py'
Jan 23 10:09:41 compute-2 sudo[217374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:41 compute-2 ceph-mon[75771]: pgmap v484: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:41 compute-2 python3.9[217376]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:41 compute-2 sudo[217374]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:41 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:41 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:42.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:42 compute-2 sudo[217527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukcarwzpocgetlukpkaumqycwbcqvxun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162981.853502-2143-278640307075028/AnsiballZ_file.py'
Jan 23 10:09:42 compute-2 sudo[217527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:42 compute-2 python3.9[217530]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:42 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:42 compute-2 sudo[217527]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:42 compute-2 sudo[217680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvbkswwoygjqhssulxrlbrtdzbghvkll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162982.6677742-2143-108692065725116/AnsiballZ_file.py'
Jan 23 10:09:42 compute-2 sudo[217680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:43 compute-2 python3.9[217682]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:43 compute-2 sudo[217680]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:43.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:43 compute-2 sudo[217832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwjvsocnrmmqapssxtvbwjnxsawhwoec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162983.397942-2209-93344143010852/AnsiballZ_file.py'
Jan 23 10:09:43 compute-2 sudo[217832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:43 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:43 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:43 compute-2 python3.9[217834]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:43 compute-2 sudo[217832]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:44 compute-2 ceph-mon[75771]: pgmap v485: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:09:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:44.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:44 compute-2 sudo[217986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reodkcjpdmsxrayamutzjopycprdzjqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162984.052122-2209-1230874361040/AnsiballZ_file.py'
Jan 23 10:09:44 compute-2 sudo[217986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:44 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:44 compute-2 python3.9[217988]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:44 compute-2 sudo[217986]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:44 compute-2 sudo[218138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjuscaragtnvcutgqfgxbsfihareqxgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162984.7018213-2209-220927386167094/AnsiballZ_file.py'
Jan 23 10:09:44 compute-2 sudo[218138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:45 compute-2 ceph-mon[75771]: pgmap v486: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:45 compute-2 python3.9[218140]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:45.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:45 compute-2 sudo[218138]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:45 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:45 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580045a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:45 compute-2 sudo[218290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piofbkiazcmorvmfirmykhftflftevmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162985.5742218-2209-122646802498233/AnsiballZ_file.py'
Jan 23 10:09:45 compute-2 sudo[218290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:46 compute-2 python3.9[218292]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:46 compute-2 sudo[218290]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:46.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:46 compute-2 sudo[218444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylaebhwqfjqipwvptyjllxqfgqvhfosa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162986.2308424-2209-163045300906725/AnsiballZ_file.py'
Jan 23 10:09:46 compute-2 sudo[218444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:46 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:46 compute-2 python3.9[218446]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:46 compute-2 sudo[218444]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:47 compute-2 sudo[218596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnwtyieglactyvpzzffggjwugsixatul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162986.8554683-2209-161866668425482/AnsiballZ_file.py'
Jan 23 10:09:47 compute-2 sudo[218596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:47.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:47 compute-2 python3.9[218598]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:47 compute-2 sudo[218596]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:47 compute-2 ceph-mon[75771]: pgmap v487: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:09:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:47 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:47 compute-2 sudo[218748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozvwtfnxwobctlovrvtgwvjdaodvwgdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162987.482739-2209-109973873805315/AnsiballZ_file.py'
Jan 23 10:09:47 compute-2 sudo[218748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:47 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:47 compute-2 python3.9[218750]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:47 compute-2 sudo[218748]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:48.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:48 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580045c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:48 compute-2 ceph-mon[75771]: pgmap v488: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:49.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:49 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:49 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:09:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:50.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:50 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:51 compute-2 ceph-mon[75771]: pgmap v489: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:51.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:51 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:51 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:09:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:52.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:09:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:52 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74001110 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:09:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:53.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:09:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:53 compute-2 ceph-mon[75771]: pgmap v490: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:09:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:53 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:53 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:54.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:54 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:54 compute-2 sudo[218910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-errwamybmgzhtorawjwybklhwhozxftb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162994.1714652-2533-101382985934374/AnsiballZ_getent.py'
Jan 23 10:09:54 compute-2 sudo[218910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:54 compute-2 python3.9[218912]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 23 10:09:54 compute-2 sudo[218910]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:55.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:09:55.472 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:09:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:09:55.473 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:09:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:09:55.474 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:09:55 compute-2 ceph-mon[75771]: pgmap v491: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:55 compute-2 sudo[219076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhmjpepqbhgbsctjgrflbhuevjtppomu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162995.139749-2558-14852983444577/AnsiballZ_group.py'
Jan 23 10:09:55 compute-2 sudo[219076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:55 compute-2 podman[219037]: 2026-01-23 10:09:55.687000365 +0000 UTC m=+0.124084792 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 23 10:09:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:55 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:55 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:55 compute-2 python3.9[219084]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 10:09:55 compute-2 groupadd[219092]: group added to /etc/group: name=nova, GID=42436
Jan 23 10:09:55 compute-2 groupadd[219092]: group added to /etc/gshadow: name=nova
Jan 23 10:09:55 compute-2 groupadd[219092]: new group: name=nova, GID=42436
Jan 23 10:09:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:55 compute-2 sudo[219076]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:56.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:56 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:56 compute-2 ceph-mon[75771]: pgmap v492: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:09:56 compute-2 sudo[219249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uugmyafwxynsxpsazxnolpjkiceafaft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162996.5046852-2581-11558005751050/AnsiballZ_user.py'
Jan 23 10:09:56 compute-2 sudo[219249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:09:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:57.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:09:57 compute-2 python3.9[219251]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 10:09:57 compute-2 useradd[219253]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 23 10:09:57 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:09:57 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:09:57 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:09:57 compute-2 useradd[219253]: add 'nova' to group 'libvirt'
Jan 23 10:09:57 compute-2 useradd[219253]: add 'nova' to shadow group 'libvirt'
Jan 23 10:09:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:57 compute-2 sudo[219249]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:57 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:57 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:09:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:58.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:09:58 compute-2 sshd-session[219286]: Accepted publickey for zuul from 192.168.122.30 port 33306 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:09:58 compute-2 systemd-logind[786]: New session 54 of user zuul.
Jan 23 10:09:58 compute-2 systemd[1]: Started Session 54 of User zuul.
Jan 23 10:09:58 compute-2 sshd-session[219286]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:09:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:58 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:58 compute-2 sshd-session[219290]: Received disconnect from 192.168.122.30 port 33306:11: disconnected by user
Jan 23 10:09:58 compute-2 sshd-session[219290]: Disconnected from user zuul 192.168.122.30 port 33306
Jan 23 10:09:58 compute-2 sshd-session[219286]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:09:58 compute-2 systemd[1]: session-54.scope: Deactivated successfully.
Jan 23 10:09:58 compute-2 systemd-logind[786]: Session 54 logged out. Waiting for processes to exit.
Jan 23 10:09:58 compute-2 systemd-logind[786]: Removed session 54.
Jan 23 10:09:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:09:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:59.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:59 compute-2 python3.9[219440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:09:59 compute-2 ceph-mon[75771]: pgmap v493: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:59 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:59 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:09:59 compute-2 python3.9[219561]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162999.0075846-2657-58509827036090/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:00 compute-2 podman[219563]: 2026-01-23 10:10:00.025899092 +0000 UTC m=+0.058068097 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 10:10:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:10:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:00.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:10:00 compute-2 python3.9[219732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:00 compute-2 ceph-mon[75771]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:10:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:00 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:00 compute-2 python3.9[219808]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:01.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:01 compute-2 sudo[219959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:10:01 compute-2 sudo[219959]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:01 compute-2 sudo[219959]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:01 compute-2 python3.9[219958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:01 compute-2 ceph-mon[75771]: pgmap v494: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:01 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:01 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:01 compute-2 python3.9[220104]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163001.0560858-2657-199495556742565/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:02.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:02 compute-2 python3.9[220256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:03 compute-2 python3.9[220377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163002.1102335-2657-31445433639788/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:03.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:03 compute-2 ceph-mon[75771]: pgmap v495: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:03 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:03 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:03 compute-2 python3.9[220527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:04 compute-2 python3.9[220649]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163003.1510732-2657-27368902594632/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:04.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:04 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74002bc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:04 compute-2 python3.9[220800]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:05.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:05 compute-2 python3.9[220921]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163004.4312184-2657-35097282766233/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:05 compute-2 ceph-mon[75771]: pgmap v496: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:10:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:05 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:05 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:06.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:06 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:06 compute-2 ceph-mon[75771]: pgmap v497: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:10:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:07 compute-2 sudo[221073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhgfumwtefecvjjwzjptvdvuryenjhdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163006.8213627-2906-92733740347841/AnsiballZ_file.py'
Jan 23 10:10:07 compute-2 sudo[221073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:10:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:07.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:10:07 compute-2 python3.9[221075]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:10:07 compute-2 sudo[221073]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:07 compute-2 sudo[221225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpkriorsdzjntlbnifsdaafzrihirfui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163007.499906-2930-34053983608495/AnsiballZ_copy.py'
Jan 23 10:10:07 compute-2 sudo[221225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:07 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74002bc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:07 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:07 compute-2 python3.9[221227]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:10:07 compute-2 sudo[221225]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:08.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:08 compute-2 sudo[221379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwnlqplwcdiqkduurghwnqtufxmyutvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163008.3537862-2954-43716157811774/AnsiballZ_stat.py'
Jan 23 10:10:08 compute-2 sudo[221379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:08 compute-2 python3.9[221381]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:08 compute-2 sudo[221379]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:09.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:09 compute-2 sudo[221531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ophialfwddvezkdvxwjtrqknwapxngdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163009.0552902-2980-68720861896106/AnsiballZ_stat.py'
Jan 23 10:10:09 compute-2 sudo[221531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:09 compute-2 ceph-mon[75771]: pgmap v498: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:09 compute-2 python3.9[221533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:09 compute-2 sudo[221531]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:09 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:09 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:09 compute-2 sudo[221654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjcotdbsclodpxzdsbhixzieedzehtta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163009.0552902-2980-68720861896106/AnsiballZ_copy.py'
Jan 23 10:10:09 compute-2 sudo[221654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:09 compute-2 python3.9[221656]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769163009.0552902-2980-68720861896106/.source _original_basename=.qvr9pjna follow=False checksum=7ec3f985e290d2ef791d15595ca8f7c7030c8ec4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 23 10:10:10 compute-2 sudo[221654]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:10.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:10 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:10 compute-2 python3.9[221811]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:11.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:11 compute-2 sudo[221926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:10:11 compute-2 sudo[221926]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:11 compute-2 sudo[221926]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:11 compute-2 sudo[221989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:10:11 compute-2 sudo[221989]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:11 compute-2 ceph-mon[75771]: pgmap v499: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:11 compute-2 python3.9[221988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:11 compute-2 sudo[221989]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:12 compute-2 python3.9[222158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163011.1817672-3056-124980180761381/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=53b8456782b81b5794d3eef3fadcfb00db1088a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:12.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:12 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:12 compute-2 ceph-mon[75771]: pgmap v500: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:10:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:10:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:10:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:10:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:10:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:12 compute-2 python3.9[222319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:13.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:13 compute-2 python3.9[222440]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163012.3389883-3101-258636953608352/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0333d3a3f5c3a0526b0ebe430250032166710e8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:13 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:13 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:14.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:14 compute-2 sudo[222592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfxydrrghvvqhgpbfujadrqknpfalrxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163014.0144908-3152-222133228728492/AnsiballZ_container_config_data.py'
Jan 23 10:10:14 compute-2 sudo[222592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:14 compute-2 python3.9[222594]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 23 10:10:14 compute-2 sudo[222592]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:15.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:15 compute-2 sudo[222744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krzkznerejwccfexvdmuuvkqbwouzska ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163015.070813-3185-69188162427750/AnsiballZ_container_config_hash.py'
Jan 23 10:10:15 compute-2 sudo[222744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:15 compute-2 ceph-mon[75771]: pgmap v501: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:15 compute-2 python3.9[222746]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 10:10:15 compute-2 sudo[222744]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:10:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:16.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:10:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:16 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:16 compute-2 sudo[222898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwstpggnosuxzhdcsptvkbxssrthpetk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769163016.1064222-3215-211953398381096/AnsiballZ_edpm_container_manage.py'
Jan 23 10:10:16 compute-2 sudo[222898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:16 compute-2 ceph-mon[75771]: pgmap v502: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:10:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:16 compute-2 python3[222900]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 10:10:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:10:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:17.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:10:17 compute-2 sudo[222928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:10:17 compute-2 sudo[222928]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:17 compute-2 sudo[222928]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:18 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:18 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:18.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:18 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003a00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:19 compute-2 ceph-mon[75771]: pgmap v503: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:19.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:10:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:20.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:20 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:21.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:21 compute-2 ceph-mon[75771]: pgmap v504: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:21 compute-2 sudo[223002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:10:21 compute-2 sudo[223002]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:21 compute-2 sudo[223002]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:22.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:22 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:23.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003a40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:24.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:24 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580014d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:25.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:26.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:26 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50002050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:27.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580014d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:28.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:28 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:28 compute-2 podman[223051]: 2026-01-23 10:10:28.780583362 +0000 UTC m=+2.207214381 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 10:10:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:29 compute-2 podman[222915]: 2026-01-23 10:10:29.255692891 +0000 UTC m=+12.234540152 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 10:10:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:29.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:29 compute-2 ceph-mon[75771]: pgmap v505: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:29 compute-2 podman[223102]: 2026-01-23 10:10:29.420849371 +0000 UTC m=+0.058015506 container create ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 10:10:29 compute-2 podman[223102]: 2026-01-23 10:10:29.39109285 +0000 UTC m=+0.028259035 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 10:10:29 compute-2 python3[222900]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 23 10:10:29 compute-2 sudo[222898]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50002050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580014d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:30 compute-2 podman[223265]: 2026-01-23 10:10:30.241543426 +0000 UTC m=+0.044128975 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:10:30 compute-2 sudo[223305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byidmjwcmbrnjbvqotytkhowbawngwdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163029.9867225-3239-217502751127997/AnsiballZ_stat.py'
Jan 23 10:10:30 compute-2 sudo[223305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:30.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:30 compute-2 ceph-mon[75771]: pgmap v506: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:30 compute-2 ceph-mon[75771]: pgmap v507: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:10:30 compute-2 ceph-mon[75771]: pgmap v508: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:30 compute-2 python3.9[223312]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:30 compute-2 sudo[223305]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:30 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:31.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:31 compute-2 ceph-mon[75771]: pgmap v509: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:31 compute-2 sudo[223465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jespsrrqpkflahlpuyzwdijufcefnoiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163031.2108696-3275-54464064239409/AnsiballZ_container_config_data.py'
Jan 23 10:10:31 compute-2 sudo[223465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:31 compute-2 python3.9[223467]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 23 10:10:31 compute-2 sudo[223465]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:31 compute-2 kernel: ganesha.nfsd[218780]: segfault at 50 ip 00007f8000eeb32e sp 00007f7f7affc210 error 4 in libntirpc.so.5.8[7f8000ed0000+2c000] likely on CPU 4 (core 0, socket 4)
Jan 23 10:10:31 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:10:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy ignored for local
Jan 23 10:10:31 compute-2 systemd[1]: Started Process Core Dump (PID 223492/UID 0).
Jan 23 10:10:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:32 compute-2 sudo[223621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvlnvocrkxmfyrcbkfsdxwcocbhevbnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163032.068858-3308-224917831361260/AnsiballZ_container_config_hash.py'
Jan 23 10:10:32 compute-2 sudo[223621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:32.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:32 compute-2 python3.9[223623]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 10:10:32 compute-2 sudo[223621]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:33 compute-2 sudo[223773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evgusffwtqulwofzxaspsynvmncebvux ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769163032.9053905-3338-179595599525092/AnsiballZ_edpm_container_manage.py'
Jan 23 10:10:33 compute-2 sudo[223773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:33.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:34.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:34 compute-2 python3[223775]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 10:10:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:34 compute-2 podman[223812]: 2026-01-23 10:10:34.674853522 +0000 UTC m=+0.022331940 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 10:10:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:35.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:36.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:37.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:38.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:39.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:39 compute-2 ceph-mds[83039]: mds.beacon.cephfs.compute-2.prgzmm missed beacon ack from the monitors
Jan 23 10:10:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:10:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3570 writes, 20K keys, 3570 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.04 MB/s
                                           Cumulative WAL: 3569 writes, 3569 syncs, 1.00 writes per sync, written: 0.05 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1383 writes, 6411 keys, 1383 commit groups, 1.0 writes per commit group, ingest: 16.17 MB, 0.03 MB/s
                                           Interval WAL: 1382 writes, 1382 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     84.9      0.33              0.13         9    0.036       0      0       0.0       0.0
                                             L6      1/0   12.81 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5     73.9     64.1      1.53              0.51         8    0.191     39K   4177       0.0       0.0
                                            Sum      1/0   12.81 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     60.9     67.8      1.85              0.64        17    0.109     39K   4177       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.1     95.0     95.5      0.46              0.13         6    0.076     16K   1877       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     73.9     64.1      1.53              0.51         8    0.191     39K   4177       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     85.4      0.32              0.13         8    0.041       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.027, interval 0.007
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.12 GB write, 0.10 MB/s write, 0.11 GB read, 0.09 MB/s read, 1.9 seconds
                                           Interval compaction: 0.04 GB write, 0.07 MB/s write, 0.04 GB read, 0.07 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 4.90 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000112 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(263,4.57 MB,1.50339%) FilterBlock(17,118.48 KB,0.0380616%) IndexBlock(17,221.48 KB,0.0711491%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:10:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:40 compute-2 systemd-coredump[223493]: Process 202071 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 62:
                                                    #0  0x00007f8000eeb32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007f8000ef5900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:10:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).paxos(paxos updating c 1256..1981) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 0.590739429s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 23 10:10:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2[75767]: 2026-01-23T10:10:40.289+0000 7fdb29d57640 -1 mon.compute-2@1(peon).paxos(paxos updating c 1256..1981) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 0.590739429s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 23 10:10:40 compute-2 ceph-mon[75771]: pgmap v510: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:40 compute-2 podman[223812]: 2026-01-23 10:10:40.302420944 +0000 UTC m=+5.649899342 container create f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:10:40 compute-2 python3[223775]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume 
/etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b kolla_start
Jan 23 10:10:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:40.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:40 compute-2 systemd[1]: systemd-coredump@8-223492-0.service: Deactivated successfully.
Jan 23 10:10:40 compute-2 systemd[1]: systemd-coredump@8-223492-0.service: Consumed 1.235s CPU time.
Jan 23 10:10:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:40 compute-2 podman[223854]: 2026-01-23 10:10:40.405620111 +0000 UTC m=+0.024027583 container died 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:10:40 compute-2 systemd[1]: var-lib-containers-storage-overlay-c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47-merged.mount: Deactivated successfully.
Jan 23 10:10:40 compute-2 sudo[223773]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:40 compute-2 podman[223854]: 2026-01-23 10:10:40.446071414 +0000 UTC m=+0.064478866 container remove 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 23 10:10:40 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:10:40 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:10:40 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.722s CPU time.
Jan 23 10:10:40 compute-2 sudo[224049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eldwoipukqiiurafjzjhctmmrjlzceak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163040.622534-3362-225948115028956/AnsiballZ_stat.py'
Jan 23 10:10:40 compute-2 sudo[224049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:41 compute-2 python3.9[224051]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:41 compute-2 sudo[224049]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:41.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:41 compute-2 sudo[224151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:10:41 compute-2 sudo[224151]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:41 compute-2 sudo[224151]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:41 compute-2 sudo[224228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmediydgjhigeovkhzeffitzwowcqkgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163041.447674-3389-279724749744833/AnsiballZ_file.py'
Jan 23 10:10:41 compute-2 sudo[224228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:41 compute-2 python3.9[224230]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:10:41 compute-2 sudo[224228]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:42 compute-2 sudo[224380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxfjqxkovbwsvmjvmsyygxqxmzcvomai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163041.9439676-3389-84431096401087/AnsiballZ_copy.py'
Jan 23 10:10:42 compute-2 sudo[224380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:42.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:43.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:44.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:44 compute-2 python3.9[224382]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769163041.9439676-3389-84431096401087/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:10:44 compute-2 sudo[224380]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:44 compute-2 sudo[224459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxshemkfhhiengsyffketxirwybdymwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163041.9439676-3389-84431096401087/AnsiballZ_systemd.py'
Jan 23 10:10:44 compute-2 sudo[224459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:44 compute-2 python3.9[224461]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:10:44 compute-2 systemd[1]: Reloading.
Jan 23 10:10:45 compute-2 systemd-rc-local-generator[224490]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:10:45 compute-2 systemd-sysv-generator[224494]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:10:45 compute-2 ceph-mon[75771]: pgmap v511: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:10:45 compute-2 ceph-mon[75771]: pgmap v512: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:45 compute-2 ceph-mon[75771]: pgmap v513: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:45 compute-2 ceph-mon[75771]: pgmap v514: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:45 compute-2 ceph-mon[75771]: pgmap v515: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:45.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:45 compute-2 sudo[224459]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:45 compute-2 sudo[224571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrsxnujeaetqswzwrkivkqgifnzkysis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163041.9439676-3389-84431096401087/AnsiballZ_systemd.py'
Jan 23 10:10:45 compute-2 sudo[224571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101045 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:10:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:45 compute-2 python3.9[224573]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:10:45 compute-2 systemd[1]: Reloading.
Jan 23 10:10:46 compute-2 systemd-sysv-generator[224607]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:10:46 compute-2 systemd-rc-local-generator[224604]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:10:46 compute-2 ceph-mon[75771]: pgmap v516: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:46 compute-2 systemd[1]: Starting nova_compute container...
Jan 23 10:10:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:46.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:46 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:10:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:46 compute-2 podman[224614]: 2026-01-23 10:10:46.432064679 +0000 UTC m=+0.128288715 container init f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=nova_compute)
Jan 23 10:10:46 compute-2 podman[224614]: 2026-01-23 10:10:46.436955969 +0000 UTC m=+0.133179985 container start f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Jan 23 10:10:46 compute-2 nova_compute[224630]: + sudo -E kolla_set_configs
Jan 23 10:10:46 compute-2 podman[224614]: nova_compute
Jan 23 10:10:46 compute-2 systemd[1]: Started nova_compute container.
Jan 23 10:10:46 compute-2 sudo[224571]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Validating config file
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying service configuration files
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Deleting /etc/ceph
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Creating directory /etc/ceph
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Writing out command to execute
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:46 compute-2 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 10:10:46 compute-2 nova_compute[224630]: ++ cat /run_command
Jan 23 10:10:46 compute-2 nova_compute[224630]: + CMD=nova-compute
Jan 23 10:10:46 compute-2 nova_compute[224630]: + ARGS=
Jan 23 10:10:46 compute-2 nova_compute[224630]: + sudo kolla_copy_cacerts
Jan 23 10:10:46 compute-2 nova_compute[224630]: + [[ ! -n '' ]]
Jan 23 10:10:46 compute-2 nova_compute[224630]: + . kolla_extend_start
Jan 23 10:10:46 compute-2 nova_compute[224630]: Running command: 'nova-compute'
Jan 23 10:10:46 compute-2 nova_compute[224630]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 10:10:46 compute-2 nova_compute[224630]: + umask 0022
Jan 23 10:10:46 compute-2 nova_compute[224630]: + exec nova-compute
Jan 23 10:10:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:47.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:47 compute-2 ceph-mon[75771]: pgmap v517: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:47 compute-2 python3.9[224792]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:48.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:48 compute-2 python3.9[224944]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:49 compute-2 nova_compute[224630]: 2026-01-23 10:10:49.051 224634 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:49 compute-2 nova_compute[224630]: 2026-01-23 10:10:49.051 224634 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:49 compute-2 nova_compute[224630]: 2026-01-23 10:10:49.052 224634 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:49 compute-2 nova_compute[224630]: 2026-01-23 10:10:49.052 224634 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 23 10:10:49 compute-2 nova_compute[224630]: 2026-01-23 10:10:49.193 224634 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:49 compute-2 nova_compute[224630]: 2026-01-23 10:10:49.217 224634 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:49 compute-2 nova_compute[224630]: 2026-01-23 10:10:49.217 224634 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 10:10:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:49.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:49 compute-2 python3.9[225097]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:49 compute-2 ceph-mon[75771]: pgmap v518: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.041 224634 INFO nova.virt.driver [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.175 224634 INFO nova.compute.provider_config [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.184 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.184 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.184 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 WARNING oslo_config.cfg [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 10:10:50 compute-2 nova_compute[224630]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 10:10:50 compute-2 nova_compute[224630]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 10:10:50 compute-2 nova_compute[224630]: and ``live_migration_inbound_addr`` respectively.
Jan 23 10:10:50 compute-2 nova_compute[224630]: ).  Its value may be silently ignored in the future.
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_secret_uuid        = f3005f84-239a-55b6-a948-8f1fb592b920 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.302 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.302 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.302 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.302 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.320 224634 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.337 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.338 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.338 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.338 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 23 10:10:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:50.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:50 compute-2 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 10:10:50 compute-2 systemd[1]: Started libvirt QEMU daemon.
Jan 23 10:10:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.415 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2f79e52760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.417 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2f79e52760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.418 224634 INFO nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Connection event '1' reason 'None'
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.435 224634 WARNING nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Jan 23 10:10:50 compute-2 nova_compute[224630]: 2026-01-23 10:10:50.435 224634 DEBUG nova.virt.libvirt.volume.mount [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 23 10:10:50 compute-2 sudo[225294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwuxvfwblrvsdcviaybhuqlcjltijxrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163049.9828532-3569-138421803189023/AnsiballZ_podman_container.py'
Jan 23 10:10:50 compute-2 sudo[225294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:50 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 9.
Jan 23 10:10:50 compute-2 python3.9[225301]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 10:10:50 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:10:50 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.722s CPU time.
Jan 23 10:10:50 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:10:50 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:10:50 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:10:50 compute-2 sudo[225294]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:51 compute-2 podman[225400]: 2026-01-23 10:10:50.918656323 +0000 UTC m=+0.020642779 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.278 224634 INFO nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]: 
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <host>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <uuid>84c28ede-4112-4d76-8f99-c7405a7d029c</uuid>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <arch>x86_64</arch>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model>EPYC-Rome-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <vendor>AMD</vendor>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <microcode version='16777317'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <signature family='23' model='49' stepping='0'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='x2apic'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='tsc-deadline'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='osxsave'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='hypervisor'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='tsc_adjust'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='spec-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='stibp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='arch-capabilities'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='cmp_legacy'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='topoext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='virt-ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='lbrv'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='tsc-scale'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='vmcb-clean'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='pause-filter'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='pfthreshold'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='svme-addr-chk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='rdctl-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='skip-l1dfl-vmentry'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='mds-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature name='pschange-mc-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <pages unit='KiB' size='4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <pages unit='KiB' size='2048'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <pages unit='KiB' size='1048576'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <power_management>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <suspend_mem/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </power_management>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <iommu support='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <migration_features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <live/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <uri_transports>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <uri_transport>tcp</uri_transport>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <uri_transport>rdma</uri_transport>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </uri_transports>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </migration_features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <topology>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <cells num='1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <cell id='0'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:           <memory unit='KiB'>7864316</memory>
Jan 23 10:10:51 compute-2 nova_compute[224630]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 23 10:10:51 compute-2 nova_compute[224630]:           <pages unit='KiB' size='2048'>0</pages>
Jan 23 10:10:51 compute-2 nova_compute[224630]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 23 10:10:51 compute-2 nova_compute[224630]:           <distances>
Jan 23 10:10:51 compute-2 nova_compute[224630]:             <sibling id='0' value='10'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:           </distances>
Jan 23 10:10:51 compute-2 nova_compute[224630]:           <cpus num='8'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:           </cpus>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         </cell>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </cells>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </topology>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <cache>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </cache>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <secmodel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model>selinux</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <doi>0</doi>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </secmodel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <secmodel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model>dac</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <doi>0</doi>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </secmodel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </host>
Jan 23 10:10:51 compute-2 nova_compute[224630]: 
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <guest>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <os_type>hvm</os_type>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <arch name='i686'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <wordsize>32</wordsize>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <domain type='qemu'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <domain type='kvm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </arch>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <pae/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <nonpae/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <acpi default='on' toggle='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <apic default='on' toggle='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <cpuselection/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <deviceboot/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <disksnapshot default='on' toggle='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <externalSnapshot/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </guest>
Jan 23 10:10:51 compute-2 nova_compute[224630]: 
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <guest>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <os_type>hvm</os_type>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <arch name='x86_64'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <wordsize>64</wordsize>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <domain type='qemu'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <domain type='kvm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </arch>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <acpi default='on' toggle='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <apic default='on' toggle='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <cpuselection/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <deviceboot/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <disksnapshot default='on' toggle='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <externalSnapshot/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </guest>
Jan 23 10:10:51 compute-2 nova_compute[224630]: 
Jan 23 10:10:51 compute-2 nova_compute[224630]: </capabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]: 
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.285 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.302 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 10:10:51 compute-2 nova_compute[224630]: <domainCapabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <domain>kvm</domain>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <arch>i686</arch>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <vcpu max='240'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <iothreads supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <os supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <enum name='firmware'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <loader supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>rom</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pflash</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='readonly'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>yes</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>no</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='secure'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>no</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </loader>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </os>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>on</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>off</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='maximumMigratable'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>on</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>off</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <vendor>AMD</vendor>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='succor'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='custom' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ddpd-u'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sha512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ddpd-u'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sha512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbpb'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbpb'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-128'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-256'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-128'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-256'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:51.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='KnightsMill'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512er'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512pf'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512er'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512pf'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tbm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tbm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='athlon'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='athlon-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='core2duo'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='core2duo-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='coreduo'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='coreduo-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='n270'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='n270-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='phenom'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='phenom-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <memoryBacking supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <enum name='sourceType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>file</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>anonymous</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>memfd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </memoryBacking>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <devices>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <disk supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='diskDevice'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>disk</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>cdrom</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>floppy</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>lun</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='bus'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ide</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>fdc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>scsi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>sata</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-non-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </disk>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <graphics supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vnc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>egl-headless</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dbus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </graphics>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <video supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='modelType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vga</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>cirrus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>none</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>bochs</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ramfb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </video>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <hostdev supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='mode'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>subsystem</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='startupPolicy'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>default</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>mandatory</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>requisite</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>optional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='subsysType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pci</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>scsi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='capsType'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='pciBackend'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </hostdev>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <rng supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-non-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>random</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>egd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>builtin</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </rng>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <filesystem supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='driverType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>path</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>handle</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtiofs</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </filesystem>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <tpm supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tpm-tis</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tpm-crb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>emulator</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>external</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendVersion'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>2.0</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </tpm>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <redirdev supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='bus'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </redirdev>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <channel supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pty</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>unix</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </channel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <crypto supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>qemu</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>builtin</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </crypto>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <interface supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>default</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>passt</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </interface>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <panic supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>isa</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>hyperv</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </panic>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <console supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>null</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pty</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dev</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>file</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pipe</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>stdio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>udp</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tcp</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>unix</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>qemu-vdagent</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dbus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </console>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </devices>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <gic supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <genid supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <backup supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <async-teardown supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <s390-pv supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <ps2 supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <tdx supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <sev supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <sgx supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <hyperv supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='features'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>relaxed</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vapic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>spinlocks</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vpindex</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>runtime</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>synic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>stimer</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>reset</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vendor_id</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>frequencies</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>reenlightenment</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tlbflush</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ipi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>avic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>emsr_bitmap</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>xmm_input</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <defaults>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </defaults>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </hyperv>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <launchSecurity supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </features>
Jan 23 10:10:51 compute-2 nova_compute[224630]: </domainCapabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.308 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 10:10:51 compute-2 nova_compute[224630]: <domainCapabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <domain>kvm</domain>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <arch>i686</arch>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <vcpu max='4096'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <iothreads supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <os supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <enum name='firmware'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <loader supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>rom</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pflash</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='readonly'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>yes</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>no</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='secure'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>no</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </loader>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </os>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>on</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>off</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='maximumMigratable'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>on</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>off</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <vendor>AMD</vendor>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='succor'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='custom' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ddpd-u'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sha512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ddpd-u'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sha512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbpb'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbpb'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-128'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-256'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-128'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-256'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='KnightsMill'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512er'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512pf'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512er'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512pf'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tbm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tbm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='athlon'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='athlon-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='core2duo'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='core2duo-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='coreduo'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='coreduo-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='n270'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='n270-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='phenom'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='phenom-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <memoryBacking supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <enum name='sourceType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>file</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>anonymous</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>memfd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </memoryBacking>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <devices>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <disk supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='diskDevice'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>disk</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>cdrom</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>floppy</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>lun</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='bus'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>fdc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>scsi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>sata</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-non-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </disk>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <graphics supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vnc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>egl-headless</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dbus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </graphics>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <video supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='modelType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vga</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>cirrus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>none</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>bochs</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ramfb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </video>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <hostdev supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='mode'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>subsystem</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='startupPolicy'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>default</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>mandatory</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>requisite</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>optional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='subsysType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pci</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>scsi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='capsType'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='pciBackend'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </hostdev>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <rng supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-non-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>random</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>egd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>builtin</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </rng>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <filesystem supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='driverType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>path</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>handle</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtiofs</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </filesystem>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <tpm supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tpm-tis</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tpm-crb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>emulator</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>external</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendVersion'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>2.0</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </tpm>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <redirdev supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='bus'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </redirdev>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <channel supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pty</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>unix</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </channel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <crypto supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>qemu</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>builtin</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </crypto>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <interface supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>default</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>passt</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </interface>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <panic supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>isa</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>hyperv</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </panic>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <console supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>null</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pty</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dev</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>file</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pipe</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>stdio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>udp</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tcp</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>unix</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>qemu-vdagent</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dbus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </console>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </devices>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <gic supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <genid supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <backup supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <async-teardown supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <s390-pv supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <ps2 supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <tdx supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <sev supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <sgx supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <hyperv supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='features'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>relaxed</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vapic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>spinlocks</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vpindex</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>runtime</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>synic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>stimer</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>reset</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vendor_id</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>frequencies</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>reenlightenment</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tlbflush</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ipi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>avic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>emsr_bitmap</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>xmm_input</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <defaults>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </defaults>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </hyperv>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <launchSecurity supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </features>
Jan 23 10:10:51 compute-2 nova_compute[224630]: </domainCapabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.358 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.362 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 10:10:51 compute-2 nova_compute[224630]: <domainCapabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <domain>kvm</domain>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <arch>x86_64</arch>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <vcpu max='240'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <iothreads supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <os supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <enum name='firmware'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <loader supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>rom</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pflash</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='readonly'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>yes</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>no</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='secure'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>no</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </loader>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </os>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>on</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>off</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='maximumMigratable'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>on</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>off</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <vendor>AMD</vendor>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='succor'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='custom' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ddpd-u'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sha512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ddpd-u'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sha512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 sudo[225548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecnxxrrxcsrzhjscszxdqukhljcqwsdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163051.226287-3594-21089424886954/AnsiballZ_systemd.py'
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbpb'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 sudo[225548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbpb'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-128'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-256'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-128'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-256'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='KnightsMill'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512er'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512pf'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512er'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512pf'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tbm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tbm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='athlon'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='athlon-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='core2duo'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='core2duo-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='coreduo'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='coreduo-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='n270'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='n270-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='phenom'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='phenom-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <memoryBacking supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <enum name='sourceType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>file</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>anonymous</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>memfd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </memoryBacking>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <devices>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <disk supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='diskDevice'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>disk</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>cdrom</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>floppy</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>lun</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='bus'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ide</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>fdc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>scsi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>sata</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-non-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </disk>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <graphics supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vnc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>egl-headless</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dbus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </graphics>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <video supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='modelType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vga</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>cirrus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>none</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>bochs</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ramfb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </video>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <hostdev supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='mode'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>subsystem</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='startupPolicy'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>default</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>mandatory</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>requisite</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>optional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='subsysType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pci</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>scsi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='capsType'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='pciBackend'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </hostdev>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <rng supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-non-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>random</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>egd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>builtin</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </rng>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <filesystem supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='driverType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>path</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>handle</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtiofs</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </filesystem>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <tpm supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tpm-tis</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tpm-crb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>emulator</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>external</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendVersion'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>2.0</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </tpm>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <redirdev supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='bus'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </redirdev>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <channel supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pty</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>unix</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </channel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <crypto supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>qemu</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>builtin</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </crypto>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <interface supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>default</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>passt</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </interface>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <panic supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>isa</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>hyperv</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </panic>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <console supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>null</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pty</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dev</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>file</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pipe</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>stdio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>udp</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tcp</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>unix</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>qemu-vdagent</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dbus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </console>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </devices>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <gic supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <genid supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <backup supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <async-teardown supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <s390-pv supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <ps2 supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <tdx supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <sev supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <sgx supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <hyperv supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='features'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>relaxed</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vapic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>spinlocks</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vpindex</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>runtime</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>synic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>stimer</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>reset</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vendor_id</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>frequencies</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>reenlightenment</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tlbflush</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ipi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>avic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>emsr_bitmap</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>xmm_input</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <defaults>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </defaults>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </hyperv>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <launchSecurity supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </features>
Jan 23 10:10:51 compute-2 nova_compute[224630]: </domainCapabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.438 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 10:10:51 compute-2 nova_compute[224630]: <domainCapabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <domain>kvm</domain>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <arch>x86_64</arch>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <vcpu max='4096'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <iothreads supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <os supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <enum name='firmware'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>efi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <loader supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>rom</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pflash</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='readonly'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>yes</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>no</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='secure'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>yes</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>no</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </loader>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </os>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>on</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>off</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='maximumMigratable'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>on</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>off</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <vendor>AMD</vendor>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='succor'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <mode name='custom' supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ddpd-u'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sha512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ddpd-u'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sha512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm3'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sm4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Denverton-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbpb'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amd-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='auto-ibrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='perfmon-v2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbpb'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='stibp-always-on'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='EPYC-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-128'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-256'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-128'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-256'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx10-512'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='prefetchiti'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Haswell-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='KnightsMill'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512er'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512pf'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512er'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512pf'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tbm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fma4'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tbm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xop'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='amx-tile'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-bf16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-fp16'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bitalg'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrc'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fzrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='la57'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='taa-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ifma'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cmpccxadd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fbsdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='fsrs'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ibrs-all'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='intel-psfd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='lam'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mcdt-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pbrsb-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='psdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='serialize'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vaes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='hle'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='rtm'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512bw'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512cd'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512dq'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512f'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='avx512vl'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='invpcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pcid'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='pku'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='mpx'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='core-capability'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='split-lock-detect'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='cldemote'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='erms'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='gfni'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdir64b'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='movdiri'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='xsaves'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='athlon'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='athlon-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='core2duo'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='core2duo-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='coreduo'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='coreduo-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='n270'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='n270-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='ss'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='phenom'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <blockers model='phenom-v1'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnow'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <feature name='3dnowext'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </blockers>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </mode>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </cpu>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <memoryBacking supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <enum name='sourceType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>file</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>anonymous</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <value>memfd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </memoryBacking>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <devices>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <disk supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='diskDevice'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>disk</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>cdrom</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>floppy</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>lun</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='bus'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>fdc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>scsi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>sata</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-non-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </disk>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <graphics supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vnc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>egl-headless</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dbus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </graphics>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <video supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='modelType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vga</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>cirrus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>none</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>bochs</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ramfb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </video>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <hostdev supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='mode'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>subsystem</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='startupPolicy'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>default</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>mandatory</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>requisite</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>optional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='subsysType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pci</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>scsi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='capsType'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='pciBackend'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </hostdev>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <rng supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtio-non-transitional</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>random</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>egd</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>builtin</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </rng>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <filesystem supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='driverType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>path</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>handle</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>virtiofs</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </filesystem>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <tpm supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tpm-tis</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tpm-crb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>emulator</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>external</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendVersion'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>2.0</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </tpm>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <redirdev supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='bus'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>usb</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </redirdev>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <channel supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pty</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>unix</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </channel>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <crypto supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>qemu</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendModel'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>builtin</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </crypto>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <interface supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='backendType'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>default</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>passt</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </interface>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <panic supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='model'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>isa</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>hyperv</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </panic>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <console supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='type'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>null</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vc</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pty</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dev</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>file</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>pipe</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>stdio</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>udp</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tcp</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>unix</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>qemu-vdagent</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>dbus</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </console>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </devices>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   <features>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <gic supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <genid supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <backup supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <async-teardown supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <s390-pv supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <ps2 supported='yes'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <tdx supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <sev supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <sgx supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <hyperv supported='yes'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <enum name='features'>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>relaxed</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vapic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>spinlocks</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vpindex</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>runtime</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>synic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>stimer</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>reset</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>vendor_id</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>frequencies</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>reenlightenment</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>tlbflush</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>ipi</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>avic</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>emsr_bitmap</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <value>xmm_input</value>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </enum>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       <defaults>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:51 compute-2 nova_compute[224630]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:51 compute-2 nova_compute[224630]:       </defaults>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     </hyperv>
Jan 23 10:10:51 compute-2 nova_compute[224630]:     <launchSecurity supported='no'/>
Jan 23 10:10:51 compute-2 nova_compute[224630]:   </features>
Jan 23 10:10:51 compute-2 nova_compute[224630]: </domainCapabilities>
Jan 23 10:10:51 compute-2 nova_compute[224630]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.516 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.516 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.516 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.521 224634 INFO nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Secure Boot support detected
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.524 224634 INFO nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.524 224634 INFO nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.532 224634 DEBUG nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.560 224634 INFO nova.virt.node [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Determined node identity db762d15-510c-4120-bfc4-afe76b90b657 from /var/lib/nova/compute_id
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.578 224634 WARNING nova.compute.manager [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Compute nodes ['db762d15-510c-4120-bfc4-afe76b90b657'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.605 224634 INFO nova.compute.manager [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.637 224634 WARNING nova.compute.manager [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.638 224634 DEBUG oslo_concurrency.lockutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.638 224634 DEBUG oslo_concurrency.lockutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.638 224634 DEBUG oslo_concurrency.lockutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.638 224634 DEBUG nova.compute.resource_tracker [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:10:51 compute-2 nova_compute[224630]: 2026-01-23 10:10:51.639 224634 DEBUG oslo_concurrency.processutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:51 compute-2 python3.9[225550]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:10:51 compute-2 systemd[1]: Stopping nova_compute container...
Jan 23 10:10:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:52.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:52 compute-2 podman[225400]: 2026-01-23 10:10:52.64089142 +0000 UTC m=+1.742877846 container create 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 10:10:52 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a7d889ca7155de289c02ef8a64720d2cb4293985fa9132c6bb9ef15a832b68d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:52 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a7d889ca7155de289c02ef8a64720d2cb4293985fa9132c6bb9ef15a832b68d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:52 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a7d889ca7155de289c02ef8a64720d2cb4293985fa9132c6bb9ef15a832b68d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:52 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a7d889ca7155de289c02ef8a64720d2cb4293985fa9132c6bb9ef15a832b68d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:52 compute-2 podman[225400]: 2026-01-23 10:10:52.937327288 +0000 UTC m=+2.039313734 container init 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 10:10:52 compute-2 podman[225400]: 2026-01-23 10:10:52.945141159 +0000 UTC m=+2.047127585 container start 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 10:10:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:10:52 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1198102383' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:52 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:10:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:52 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:10:53 compute-2 nova_compute[224630]: 2026-01-23 10:10:53.008 224634 DEBUG oslo_concurrency.processutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.369s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:53 compute-2 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 10:10:53 compute-2 bash[225400]: 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0
Jan 23 10:10:53 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:10:53 compute-2 systemd[1]: Started libvirt nodedev daemon.
Jan 23 10:10:53 compute-2 nova_compute[224630]: 2026-01-23 10:10:53.159 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:10:53 compute-2 nova_compute[224630]: 2026-01-23 10:10:53.160 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:10:53 compute-2 nova_compute[224630]: 2026-01-23 10:10:53.160 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:10:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:53.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:53 compute-2 virtqemud[225221]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 23 10:10:53 compute-2 virtqemud[225221]: hostname: compute-2
Jan 23 10:10:53 compute-2 virtqemud[225221]: End of file while reading data: Input/output error
Jan 23 10:10:53 compute-2 systemd[1]: libpod-f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e.scope: Deactivated successfully.
Jan 23 10:10:53 compute-2 systemd[1]: libpod-f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e.scope: Consumed 3.626s CPU time.
Jan 23 10:10:53 compute-2 podman[225565]: 2026-01-23 10:10:53.563129742 +0000 UTC m=+1.712954071 container died f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 10:10:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:10:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:10:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:10:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:10:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:10:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:10:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/738253789' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:54 compute-2 ceph-mon[75771]: pgmap v519: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1010858224' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:54 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e-userdata-shm.mount: Deactivated successfully.
Jan 23 10:10:54 compute-2 systemd[1]: var-lib-containers-storage-overlay-e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d-merged.mount: Deactivated successfully.
Jan 23 10:10:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:54 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:10:54 compute-2 podman[225565]: 2026-01-23 10:10:54.140308421 +0000 UTC m=+2.290132690 container cleanup f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Jan 23 10:10:54 compute-2 podman[225565]: nova_compute
Jan 23 10:10:54 compute-2 podman[225673]: nova_compute
Jan 23 10:10:54 compute-2 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 23 10:10:54 compute-2 systemd[1]: Stopped nova_compute container.
Jan 23 10:10:54 compute-2 systemd[1]: Starting nova_compute container...
Jan 23 10:10:54 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:10:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-2 podman[225686]: 2026-01-23 10:10:54.320313906 +0000 UTC m=+0.096121034 container init f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:10:54 compute-2 podman[225686]: 2026-01-23 10:10:54.331249634 +0000 UTC m=+0.107056752 container start f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Jan 23 10:10:54 compute-2 podman[225686]: nova_compute
Jan 23 10:10:54 compute-2 nova_compute[225701]: + sudo -E kolla_set_configs
Jan 23 10:10:54 compute-2 systemd[1]: Started nova_compute container.
Jan 23 10:10:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:54.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:54 compute-2 sudo[225548]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Validating config file
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying service configuration files
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /etc/ceph
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Creating directory /etc/ceph
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Writing out command to execute
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:54 compute-2 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 10:10:54 compute-2 nova_compute[225701]: ++ cat /run_command
Jan 23 10:10:54 compute-2 nova_compute[225701]: + CMD=nova-compute
Jan 23 10:10:54 compute-2 nova_compute[225701]: + ARGS=
Jan 23 10:10:54 compute-2 nova_compute[225701]: + sudo kolla_copy_cacerts
Jan 23 10:10:54 compute-2 nova_compute[225701]: + [[ ! -n '' ]]
Jan 23 10:10:54 compute-2 nova_compute[225701]: + . kolla_extend_start
Jan 23 10:10:54 compute-2 nova_compute[225701]: Running command: 'nova-compute'
Jan 23 10:10:54 compute-2 nova_compute[225701]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 10:10:54 compute-2 nova_compute[225701]: + umask 0022
Jan 23 10:10:54 compute-2 nova_compute[225701]: + exec nova-compute
Jan 23 10:10:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:55 compute-2 ceph-mon[75771]: pgmap v520: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1198102383' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:55 compute-2 ceph-mon[75771]: pgmap v521: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:55 compute-2 sudo[225863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mafsppceklbvfsfabxflywbkqppcgoyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163054.8037481-3620-217754338515296/AnsiballZ_podman_container.py'
Jan 23 10:10:55 compute-2 sudo[225863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:55.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:55 compute-2 python3.9[225865]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 10:10:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:10:55.472 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:10:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:10:55.473 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:10:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:10:55.474 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:10:55 compute-2 systemd[1]: Started libpod-conmon-ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c.scope.
Jan 23 10:10:55 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:10:55 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f22e8840e86e25e717c359cb474b35854cdc3e93e9623e9d87c066db60f0155/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:55 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f22e8840e86e25e717c359cb474b35854cdc3e93e9623e9d87c066db60f0155/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:55 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f22e8840e86e25e717c359cb474b35854cdc3e93e9623e9d87c066db60f0155/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:55 compute-2 podman[225889]: 2026-01-23 10:10:55.560347709 +0000 UTC m=+0.123913446 container init ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 10:10:55 compute-2 podman[225889]: 2026-01-23 10:10:55.56766507 +0000 UTC m=+0.131230787 container start ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 10:10:55 compute-2 python3.9[225865]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Applying nova statedir ownership
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 23 10:10:55 compute-2 nova_compute_init[225911]: INFO:nova_statedir:Nova statedir ownership complete
Jan 23 10:10:55 compute-2 systemd[1]: libpod-ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c.scope: Deactivated successfully.
Jan 23 10:10:55 compute-2 podman[225912]: 2026-01-23 10:10:55.624311952 +0000 UTC m=+0.026961454 container died ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 23 10:10:55 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c-userdata-shm.mount: Deactivated successfully.
Jan 23 10:10:55 compute-2 systemd[1]: var-lib-containers-storage-overlay-3f22e8840e86e25e717c359cb474b35854cdc3e93e9623e9d87c066db60f0155-merged.mount: Deactivated successfully.
Jan 23 10:10:55 compute-2 podman[225923]: 2026-01-23 10:10:55.680435472 +0000 UTC m=+0.047145921 container cleanup ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 10:10:55 compute-2 systemd[1]: libpod-conmon-ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c.scope: Deactivated successfully.
Jan 23 10:10:55 compute-2 sudo[225863]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:10:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:56.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:10:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:56 compute-2 nova_compute[225701]: 2026-01-23 10:10:56.617 225706 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:56 compute-2 nova_compute[225701]: 2026-01-23 10:10:56.618 225706 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:56 compute-2 nova_compute[225701]: 2026-01-23 10:10:56.618 225706 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:56 compute-2 nova_compute[225701]: 2026-01-23 10:10:56.618 225706 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 23 10:10:56 compute-2 sshd-session[201590]: Connection closed by 192.168.122.30 port 41990
Jan 23 10:10:56 compute-2 sshd-session[201587]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:10:56 compute-2 systemd[1]: session-53.scope: Deactivated successfully.
Jan 23 10:10:56 compute-2 systemd[1]: session-53.scope: Consumed 1min 57.570s CPU time.
Jan 23 10:10:56 compute-2 systemd-logind[786]: Session 53 logged out. Waiting for processes to exit.
Jan 23 10:10:56 compute-2 systemd-logind[786]: Removed session 53.
Jan 23 10:10:56 compute-2 nova_compute[225701]: 2026-01-23 10:10:56.750 225706 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:56 compute-2 nova_compute[225701]: 2026-01-23 10:10:56.773 225706 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:56 compute-2 nova_compute[225701]: 2026-01-23 10:10:56.773 225706 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 10:10:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.204 225706 INFO nova.virt.driver [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.304 225706 INFO nova.compute.provider_config [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.313 225706 DEBUG oslo_concurrency.lockutils [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.313 225706 DEBUG oslo_concurrency.lockutils [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.313 225706 DEBUG oslo_concurrency.lockutils [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:57.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.366 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.366 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.366 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.366 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.379 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.379 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.379 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.379 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.388 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.388 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.388 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.388 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.392 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.392 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.392 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.392 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.393 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.393 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.393 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.393 225706 WARNING oslo_config.cfg [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 10:10:57 compute-2 nova_compute[225701]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 10:10:57 compute-2 nova_compute[225701]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 10:10:57 compute-2 nova_compute[225701]: and ``live_migration_inbound_addr`` respectively.
Jan 23 10:10:57 compute-2 nova_compute[225701]: ).  Its value may be silently ignored in the future.
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.395 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.395 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.395 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.395 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_secret_uuid        = f3005f84-239a-55b6-a948-8f1fb592b920 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.398 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.398 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.398 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.398 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.400 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.400 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.400 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.400 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.402 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.402 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.402 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.402 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.405 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.405 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.405 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.405 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.409 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.409 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.409 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.409 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.413 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.413 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.413 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.413 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.417 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.417 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.417 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.417 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.421 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.421 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.421 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.421 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.423 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.423 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.423 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.426 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.426 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.426 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.426 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.427 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.427 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.427 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.427 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.432 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.432 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.432 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.432 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.436 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.436 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.436 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.436 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.442 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.442 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.442 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.442 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.443 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.443 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.443 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.443 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.446 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.446 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.446 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.446 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.447 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.447 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.447 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.450 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.450 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.450 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.450 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.452 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.452 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.452 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.452 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.455 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.455 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.455 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.455 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.457 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.457 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.457 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.457 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.459 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.459 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.459 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.459 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.462 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.462 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.462 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.462 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.464 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.464 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.464 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.464 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.465 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.465 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.465 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.466 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.466 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.466 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.466 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.474 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.474 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.474 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.474 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.476 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.476 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.476 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.476 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.478 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.478 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.478 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.478 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.480 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.480 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.480 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.480 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.483 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.495 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.496 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.496 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.496 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.496 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.499 225706 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.515 225706 INFO nova.virt.node [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Determined node identity db762d15-510c-4120-bfc4-afe76b90b657 from /var/lib/nova/compute_id
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.516 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.516 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.517 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.517 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.534 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f563b9cf5b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.537 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f563b9cf5b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.538 225706 INFO nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Connection event '1' reason 'None'
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.546 225706 INFO nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]: 
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <host>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <uuid>84c28ede-4112-4d76-8f99-c7405a7d029c</uuid>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <arch>x86_64</arch>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <microcode version='16777317'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <signature family='23' model='49' stepping='0'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='x2apic'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='tsc-deadline'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='osxsave'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='hypervisor'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='tsc_adjust'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='spec-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='stibp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='arch-capabilities'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='cmp_legacy'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='topoext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='virt-ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='lbrv'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='tsc-scale'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='vmcb-clean'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='pause-filter'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='pfthreshold'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='svme-addr-chk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='rdctl-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='skip-l1dfl-vmentry'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='mds-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature name='pschange-mc-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <pages unit='KiB' size='4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <pages unit='KiB' size='2048'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <pages unit='KiB' size='1048576'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <power_management>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <suspend_mem/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </power_management>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <iommu support='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <migration_features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <live/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <uri_transports>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <uri_transport>tcp</uri_transport>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <uri_transport>rdma</uri_transport>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </uri_transports>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </migration_features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <topology>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <cells num='1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <cell id='0'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:           <memory unit='KiB'>7864316</memory>
Jan 23 10:10:57 compute-2 nova_compute[225701]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 23 10:10:57 compute-2 nova_compute[225701]:           <pages unit='KiB' size='2048'>0</pages>
Jan 23 10:10:57 compute-2 nova_compute[225701]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 23 10:10:57 compute-2 nova_compute[225701]:           <distances>
Jan 23 10:10:57 compute-2 nova_compute[225701]:             <sibling id='0' value='10'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:           </distances>
Jan 23 10:10:57 compute-2 nova_compute[225701]:           <cpus num='8'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:           </cpus>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         </cell>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </cells>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </topology>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <cache>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </cache>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <secmodel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model>selinux</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <doi>0</doi>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </secmodel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <secmodel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model>dac</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <doi>0</doi>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </secmodel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </host>
Jan 23 10:10:57 compute-2 nova_compute[225701]: 
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <guest>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <os_type>hvm</os_type>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <arch name='i686'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <wordsize>32</wordsize>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <domain type='qemu'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <domain type='kvm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </arch>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <pae/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <nonpae/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <acpi default='on' toggle='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <apic default='on' toggle='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <cpuselection/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <deviceboot/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <disksnapshot default='on' toggle='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <externalSnapshot/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </guest>
Jan 23 10:10:57 compute-2 nova_compute[225701]: 
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <guest>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <os_type>hvm</os_type>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <arch name='x86_64'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <wordsize>64</wordsize>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <domain type='qemu'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <domain type='kvm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </arch>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <acpi default='on' toggle='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <apic default='on' toggle='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <cpuselection/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <deviceboot/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <disksnapshot default='on' toggle='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <externalSnapshot/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </guest>
Jan 23 10:10:57 compute-2 nova_compute[225701]: 
Jan 23 10:10:57 compute-2 nova_compute[225701]: </capabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]: 
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.554 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.561 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 10:10:57 compute-2 nova_compute[225701]: <domainCapabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <domain>kvm</domain>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <arch>i686</arch>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <vcpu max='4096'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <iothreads supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <os supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <enum name='firmware'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <loader supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>rom</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pflash</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='readonly'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>yes</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>no</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='secure'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>no</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </loader>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </os>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>on</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>off</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='maximumMigratable'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>on</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>off</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='succor'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='custom' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 ceph-mon[75771]: pgmap v522: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:57 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:10:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:57 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='KnightsMill'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='athlon'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='athlon-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='core2duo'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='core2duo-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='coreduo'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='coreduo-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='n270'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='n270-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='phenom'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='phenom-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <memoryBacking supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <enum name='sourceType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>file</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>anonymous</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>memfd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </memoryBacking>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <devices>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <disk supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='diskDevice'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>disk</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>cdrom</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>floppy</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>lun</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='bus'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>fdc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>scsi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>sata</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <graphics supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vnc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>egl-headless</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dbus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </graphics>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <video supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='modelType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vga</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>cirrus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>none</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>bochs</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ramfb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </video>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <hostdev supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='mode'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>subsystem</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='startupPolicy'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>default</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>mandatory</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>requisite</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>optional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='subsysType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pci</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>scsi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='capsType'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='pciBackend'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </hostdev>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <rng supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>random</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>egd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>builtin</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </rng>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <filesystem supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='driverType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>path</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>handle</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtiofs</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </filesystem>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <tpm supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tpm-tis</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tpm-crb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>emulator</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>external</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendVersion'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>2.0</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </tpm>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <redirdev supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='bus'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </redirdev>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <channel supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pty</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>unix</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </channel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <crypto supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>qemu</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>builtin</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </crypto>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <interface supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>default</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>passt</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </interface>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <panic supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>isa</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>hyperv</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </panic>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <console supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>null</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pty</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dev</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>file</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pipe</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>stdio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>udp</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tcp</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>unix</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>qemu-vdagent</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dbus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </console>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </devices>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <gic supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <genid supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <backup supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <async-teardown supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <s390-pv supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <ps2 supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <tdx supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <sev supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <sgx supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <hyperv supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='features'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>relaxed</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vapic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>spinlocks</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vpindex</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>runtime</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>synic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>stimer</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>reset</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vendor_id</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>frequencies</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>reenlightenment</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tlbflush</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ipi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>avic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>emsr_bitmap</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>xmm_input</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <defaults>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </defaults>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </hyperv>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <launchSecurity supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </features>
Jan 23 10:10:57 compute-2 nova_compute[225701]: </domainCapabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.571 225706 DEBUG nova.virt.libvirt.volume.mount [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.576 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 10:10:57 compute-2 nova_compute[225701]: <domainCapabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <domain>kvm</domain>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <arch>i686</arch>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <vcpu max='240'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <iothreads supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <os supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <enum name='firmware'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <loader supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>rom</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pflash</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='readonly'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>yes</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>no</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='secure'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>no</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </loader>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </os>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>on</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>off</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='maximumMigratable'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>on</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>off</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='succor'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='custom' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='KnightsMill'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='athlon'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='athlon-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='core2duo'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='core2duo-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='coreduo'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='coreduo-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='n270'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='n270-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='phenom'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='phenom-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <memoryBacking supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <enum name='sourceType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>file</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>anonymous</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>memfd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </memoryBacking>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <devices>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <disk supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='diskDevice'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>disk</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>cdrom</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>floppy</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>lun</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='bus'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ide</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>fdc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>scsi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>sata</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <graphics supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vnc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>egl-headless</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dbus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </graphics>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <video supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='modelType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vga</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>cirrus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>none</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>bochs</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ramfb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </video>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <hostdev supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='mode'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>subsystem</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='startupPolicy'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>default</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>mandatory</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>requisite</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>optional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='subsysType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pci</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>scsi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='capsType'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='pciBackend'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </hostdev>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <rng supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>random</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>egd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>builtin</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </rng>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <filesystem supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='driverType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>path</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>handle</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtiofs</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </filesystem>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <tpm supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tpm-tis</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tpm-crb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>emulator</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>external</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendVersion'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>2.0</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </tpm>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <redirdev supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='bus'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </redirdev>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <channel supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pty</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>unix</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </channel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <crypto supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>qemu</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>builtin</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </crypto>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <interface supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>default</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>passt</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </interface>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <panic supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>isa</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>hyperv</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </panic>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <console supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>null</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pty</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dev</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>file</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pipe</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>stdio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>udp</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tcp</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>unix</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>qemu-vdagent</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dbus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </console>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </devices>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <gic supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <genid supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <backup supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <async-teardown supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <s390-pv supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <ps2 supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <tdx supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <sev supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <sgx supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <hyperv supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='features'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>relaxed</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vapic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>spinlocks</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vpindex</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>runtime</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>synic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>stimer</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>reset</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vendor_id</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>frequencies</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>reenlightenment</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tlbflush</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ipi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>avic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>emsr_bitmap</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>xmm_input</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <defaults>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </defaults>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </hyperv>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <launchSecurity supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </features>
Jan 23 10:10:57 compute-2 nova_compute[225701]: </domainCapabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.664 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.669 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 10:10:57 compute-2 nova_compute[225701]: <domainCapabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <domain>kvm</domain>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <arch>x86_64</arch>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <vcpu max='4096'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <iothreads supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <os supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <enum name='firmware'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>efi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <loader supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>rom</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pflash</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='readonly'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>yes</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>no</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='secure'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>yes</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>no</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </loader>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </os>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>on</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>off</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='maximumMigratable'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>on</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>off</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='succor'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='custom' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='KnightsMill'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='athlon'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='athlon-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='core2duo'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='core2duo-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='coreduo'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='coreduo-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='n270'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='n270-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='phenom'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='phenom-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <memoryBacking supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <enum name='sourceType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>file</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>anonymous</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>memfd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </memoryBacking>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <devices>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <disk supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='diskDevice'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>disk</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>cdrom</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>floppy</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>lun</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='bus'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>fdc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>scsi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>sata</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <graphics supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vnc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>egl-headless</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dbus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </graphics>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <video supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='modelType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vga</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>cirrus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>none</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>bochs</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ramfb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </video>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <hostdev supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='mode'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>subsystem</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='startupPolicy'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>default</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>mandatory</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>requisite</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>optional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='subsysType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pci</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>scsi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='capsType'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='pciBackend'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </hostdev>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <rng supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>random</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>egd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>builtin</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </rng>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <filesystem supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='driverType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>path</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>handle</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtiofs</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </filesystem>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <tpm supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tpm-tis</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tpm-crb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>emulator</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>external</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendVersion'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>2.0</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </tpm>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <redirdev supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='bus'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </redirdev>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <channel supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pty</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>unix</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </channel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <crypto supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>qemu</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>builtin</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </crypto>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <interface supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>default</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>passt</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </interface>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <panic supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>isa</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>hyperv</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </panic>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <console supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>null</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pty</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dev</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>file</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pipe</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>stdio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>udp</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tcp</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>unix</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>qemu-vdagent</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dbus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </console>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </devices>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <gic supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <genid supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <backup supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <async-teardown supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <s390-pv supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <ps2 supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <tdx supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <sev supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <sgx supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <hyperv supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='features'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>relaxed</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vapic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>spinlocks</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vpindex</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>runtime</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>synic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>stimer</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>reset</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vendor_id</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>frequencies</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>reenlightenment</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tlbflush</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ipi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>avic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>emsr_bitmap</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>xmm_input</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <defaults>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </defaults>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </hyperv>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <launchSecurity supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </features>
Jan 23 10:10:57 compute-2 nova_compute[225701]: </domainCapabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.752 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 10:10:57 compute-2 nova_compute[225701]: <domainCapabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <domain>kvm</domain>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <arch>x86_64</arch>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <vcpu max='240'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <iothreads supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <os supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <enum name='firmware'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <loader supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>rom</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pflash</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='readonly'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>yes</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>no</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='secure'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>no</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </loader>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </os>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>on</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>off</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='maximumMigratable'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>on</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>off</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='succor'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <mode name='custom' supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Denverton-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='EPYC-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Haswell-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='KnightsMill'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xop'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='la57'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='lam'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='hle'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='pku'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='erms'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='athlon'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='athlon-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='core2duo'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='core2duo-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='coreduo'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='coreduo-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='n270'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='n270-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='ss'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='phenom'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <blockers model='phenom-v1'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </blockers>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </mode>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </cpu>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <memoryBacking supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <enum name='sourceType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>file</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>anonymous</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <value>memfd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </memoryBacking>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <devices>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <disk supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='diskDevice'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>disk</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>cdrom</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>floppy</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>lun</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='bus'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ide</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>fdc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>scsi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>sata</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <graphics supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vnc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>egl-headless</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dbus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </graphics>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <video supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='modelType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vga</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>cirrus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>none</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>bochs</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ramfb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </video>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <hostdev supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='mode'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>subsystem</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='startupPolicy'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>default</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>mandatory</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>requisite</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>optional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='subsysType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pci</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>scsi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='capsType'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='pciBackend'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </hostdev>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <rng supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>random</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>egd</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>builtin</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </rng>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <filesystem supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='driverType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>path</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>handle</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>virtiofs</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </filesystem>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <tpm supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tpm-tis</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tpm-crb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>emulator</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>external</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendVersion'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>2.0</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </tpm>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <redirdev supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='bus'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>usb</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </redirdev>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <channel supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pty</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>unix</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </channel>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <crypto supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>qemu</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>builtin</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </crypto>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <interface supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='backendType'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>default</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>passt</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </interface>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <panic supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='model'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>isa</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>hyperv</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </panic>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <console supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='type'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>null</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vc</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pty</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dev</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>file</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>pipe</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>stdio</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>udp</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tcp</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>unix</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>qemu-vdagent</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>dbus</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </console>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </devices>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   <features>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <gic supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <genid supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <backup supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <async-teardown supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <s390-pv supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <ps2 supported='yes'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <tdx supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <sev supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <sgx supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <hyperv supported='yes'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <enum name='features'>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>relaxed</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vapic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>spinlocks</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vpindex</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>runtime</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>synic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>stimer</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>reset</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>vendor_id</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>frequencies</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>reenlightenment</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>tlbflush</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>ipi</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>avic</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>emsr_bitmap</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <value>xmm_input</value>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </enum>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       <defaults>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:57 compute-2 nova_compute[225701]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:57 compute-2 nova_compute[225701]:       </defaults>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     </hyperv>
Jan 23 10:10:57 compute-2 nova_compute[225701]:     <launchSecurity supported='no'/>
Jan 23 10:10:57 compute-2 nova_compute[225701]:   </features>
Jan 23 10:10:57 compute-2 nova_compute[225701]: </domainCapabilities>
Jan 23 10:10:57 compute-2 nova_compute[225701]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.832 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.832 225706 INFO nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Secure Boot support detected
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.834 225706 INFO nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.835 225706 INFO nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.844 225706 DEBUG nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.860 225706 INFO nova.virt.node [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Determined node identity db762d15-510c-4120-bfc4-afe76b90b657 from /var/lib/nova/compute_id
Jan 23 10:10:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.872 225706 WARNING nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Compute nodes ['db762d15-510c-4120-bfc4-afe76b90b657'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.893 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.912 225706 WARNING nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.912 225706 DEBUG oslo_concurrency.lockutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.912 225706 DEBUG oslo_concurrency.lockutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.913 225706 DEBUG oslo_concurrency.lockutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.913 225706 DEBUG nova.compute.resource_tracker [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:10:57 compute-2 nova_compute[225701]: 2026-01-23 10:10:57.913 225706 DEBUG oslo_concurrency.processutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:10:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:58.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:10:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:10:58 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/758454433' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.381 225706 DEBUG oslo_concurrency.processutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.531 225706 WARNING nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.532 225706 DEBUG nova.compute.resource_tracker [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5271MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.532 225706 DEBUG oslo_concurrency.lockutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.532 225706 DEBUG oslo_concurrency.lockutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.545 225706 WARNING nova.compute.resource_tracker [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] No compute node record for compute-2.ctlplane.example.com:db762d15-510c-4120-bfc4-afe76b90b657: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host db762d15-510c-4120-bfc4-afe76b90b657 could not be found.
Jan 23 10:10:58 compute-2 rsyslogd[1004]: imjournal from <np0005593295:nova_compute>: begin to drop messages due to rate-limiting
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.569 225706 INFO nova.compute.resource_tracker [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: db762d15-510c-4120-bfc4-afe76b90b657
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.628 225706 DEBUG nova.compute.resource_tracker [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.628 225706 DEBUG nova.compute.resource_tracker [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:10:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1853406190' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/758454433' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1515803724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.815 225706 INFO nova.scheduler.client.report [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [req-6481bd6a-f476-4cd9-8dde-2dec94556bb7] Created resource provider record via placement API for resource provider with UUID db762d15-510c-4120-bfc4-afe76b90b657 and name compute-2.ctlplane.example.com.
Jan 23 10:10:58 compute-2 nova_compute[225701]: 2026-01-23 10:10:58.850 225706 DEBUG oslo_concurrency.processutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:10:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:59.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:10:59 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2360008108' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.369 225706 DEBUG oslo_concurrency.processutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.374 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 23 10:10:59 compute-2 nova_compute[225701]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.375 225706 INFO nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] kernel doesn't support AMD SEV
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.376 225706 DEBUG nova.compute.provider_tree [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.377 225706 DEBUG nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:10:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.467 225706 DEBUG nova.scheduler.client.report [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Updated inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.467 225706 DEBUG nova.compute.provider_tree [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Updating resource provider db762d15-510c-4120-bfc4-afe76b90b657 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.468 225706 DEBUG nova.compute.provider_tree [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.546 225706 DEBUG nova.compute.provider_tree [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Updating resource provider db762d15-510c-4120-bfc4-afe76b90b657 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.577 225706 DEBUG nova.compute.resource_tracker [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.577 225706 DEBUG oslo_concurrency.lockutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.578 225706 DEBUG nova.service [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.658 225706 DEBUG nova.service [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 23 10:10:59 compute-2 nova_compute[225701]: 2026-01-23 10:10:59.658 225706 DEBUG nova.servicegroup.drivers.db [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 23 10:10:59 compute-2 podman[226047]: 2026-01-23 10:10:59.657742987 +0000 UTC m=+0.082924630 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:10:59 compute-2 ceph-mon[75771]: pgmap v523: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:10:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1862771379' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2360008108' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1031836071' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:00.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:11:00 compute-2 podman[226076]: 2026-01-23 10:11:00.626668457 +0000 UTC m=+0.053619350 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:11:00 compute-2 ceph-mon[75771]: pgmap v524: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:11:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:01.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:01 compute-2 sudo[226110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:11:01 compute-2 sudo[226110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:01 compute-2 sudo[226110]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:01 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:01 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:02.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:02 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:03.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:03 compute-2 ceph-mon[75771]: pgmap v525: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 1023 B/s wr, 4 op/s
Jan 23 10:11:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101103 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:11:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:03 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:03 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:04.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:04 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:05.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:05 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa100016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:05 compute-2 ceph-mon[75771]: pgmap v526: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1023 B/s wr, 4 op/s
Jan 23 10:11:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:11:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:05 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38001f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:05.950293) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163065950519, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1282, "num_deletes": 251, "total_data_size": 3206730, "memory_usage": 3262960, "flush_reason": "Manual Compaction"}
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163065967904, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2091630, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19379, "largest_seqno": 20656, "table_properties": {"data_size": 2086047, "index_size": 2978, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12014, "raw_average_key_size": 19, "raw_value_size": 2074807, "raw_average_value_size": 3435, "num_data_blocks": 132, "num_entries": 604, "num_filter_entries": 604, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162946, "oldest_key_time": 1769162946, "file_creation_time": 1769163065, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 17639 microseconds, and 7530 cpu microseconds.
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:05.967992) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2091630 bytes OK
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:05.968031) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:05.969658) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:05.969686) EVENT_LOG_v1 {"time_micros": 1769163065969681, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:05.969708) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3200649, prev total WAL file size 3200649, number of live WAL files 2.
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:05.970982) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2042KB)], [36(12MB)]
Jan 23 10:11:05 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163065971120, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15518955, "oldest_snapshot_seqno": -1}
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 5040 keys, 13258662 bytes, temperature: kUnknown
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163066080281, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13258662, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13223457, "index_size": 21527, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 128482, "raw_average_key_size": 25, "raw_value_size": 13130267, "raw_average_value_size": 2605, "num_data_blocks": 885, "num_entries": 5040, "num_filter_entries": 5040, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163065, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:06.080942) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13258662 bytes
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:06.082371) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.0 rd, 121.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 12.8 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(13.8) write-amplify(6.3) OK, records in: 5558, records dropped: 518 output_compression: NoCompression
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:06.082394) EVENT_LOG_v1 {"time_micros": 1769163066082384, "job": 20, "event": "compaction_finished", "compaction_time_micros": 109279, "compaction_time_cpu_micros": 33370, "output_level": 6, "num_output_files": 1, "total_output_size": 13258662, "num_input_records": 5558, "num_output_records": 5040, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163066083278, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163066086786, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:05.970808) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:06.086913) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:06.086919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:06.086921) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:06.086922) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:11:06.086925) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:06.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:06 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:07 compute-2 ceph-mon[75771]: pgmap v527: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1023 B/s wr, 4 op/s
Jan 23 10:11:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:07.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:07 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:07 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:08.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:08 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:09.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:09 compute-2 ceph-mon[75771]: pgmap v528: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:11:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:09 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:09 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:10.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:10 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:11 compute-2 ceph-mon[75771]: pgmap v529: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:11:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:11.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:11 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:11 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:12.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:12 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa140016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:13 compute-2 ceph-mon[75771]: pgmap v530: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:11:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:13.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:13 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10002160 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:13 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:14.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:14 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:15.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:15 compute-2 ceph-mon[75771]: pgmap v531: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:15 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa140016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:15 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:16.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:16 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:17 compute-2 ceph-mon[75771]: pgmap v532: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:17.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:17 compute-2 sudo[226152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:11:17 compute-2 sudo[226152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:17 compute-2 sudo[226152]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:17 compute-2 sudo[226177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 10:11:17 compute-2 sudo[226177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:17 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:17 compute-2 sudo[226177]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:17 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa140016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:17 compute-2 sudo[226222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:11:17 compute-2 sudo[226222]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:17 compute-2 sudo[226222]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:17 compute-2 sudo[226248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:11:17 compute-2 sudo[226248]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:18.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:18 compute-2 sudo[226248]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:18 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:19 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:19 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:19 compute-2 ceph-mon[75771]: pgmap v533: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:19 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:11:19 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:11:19 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:19 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:19 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:11:19 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:11:19 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:11:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:19.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:19 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:19 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:11:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:20.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:20 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:21.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:21 compute-2 sudo[226307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:11:21 compute-2 sudo[226307]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:21 compute-2 sudo[226307]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:21 compute-2 ceph-mon[75771]: pgmap v534: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:21 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:21 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:22.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:22 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:23 compute-2 ceph-mon[75771]: pgmap v535: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:11:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:23.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:23 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:23 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:24 compute-2 sudo[226335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:11:24 compute-2 sudo[226335]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:24 compute-2 sudo[226335]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:24.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:24 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:25 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:25 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:25 compute-2 ceph-mon[75771]: pgmap v536: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:25.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:25 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:25 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:26.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:26 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:27.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:27 compute-2 ceph-mon[75771]: pgmap v537: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:27 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:27 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 10:11:28 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1318506093' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:11:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 10:11:28 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1318506093' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:11:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:28.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:28 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1318506093' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:11:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1318506093' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:11:28 compute-2 ceph-mon[75771]: pgmap v538: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3198864426' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:11:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3198864426' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:11:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:29.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:29 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:29 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 10:11:30 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1327855081' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:11:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 10:11:30 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1327855081' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:11:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:30.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:30 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1327855081' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:11:30 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1327855081' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:11:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:30 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:30 compute-2 podman[226367]: 2026-01-23 10:11:30.654853189 +0000 UTC m=+0.080459019 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 10:11:30 compute-2 podman[226394]: 2026-01-23 10:11:30.728543811 +0000 UTC m=+0.050151114 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 10:11:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:31.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:31 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:31 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:32 compute-2 ceph-mon[75771]: pgmap v539: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:11:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:32.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:11:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:32 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:33 compute-2 ceph-mon[75771]: pgmap v540: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:11:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:33.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:33 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:33 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:34.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:34 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:35.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:35 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:35 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:36 compute-2 ceph-mon[75771]: pgmap v541: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:11:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:36.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:36 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:37 compute-2 ceph-mon[75771]: pgmap v542: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:37.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:37 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:37 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:38.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:38 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:39.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:39 compute-2 ceph-mon[75771]: pgmap v543: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:39 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24001f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:39 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:40.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:40 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c0013a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:41.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:41 compute-2 ceph-mon[75771]: pgmap v544: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:41 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:41 compute-2 sudo[226426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:11:41 compute-2 sudo[226426]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:41 compute-2 sudo[226426]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:41 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:42.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:42 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:43 compute-2 ceph-mon[75771]: pgmap v545: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:11:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:43.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:43 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:43 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:44.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:44 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:45.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:45 compute-2 ceph-mon[75771]: pgmap v546: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:45 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:45 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:46.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:46 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:47.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:47 compute-2 ceph-mon[75771]: pgmap v547: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:47 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:47 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:11:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:48.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:11:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:48 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:48 compute-2 ceph-mon[75771]: pgmap v548: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:49.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:49 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:49 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:11:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:50.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:50 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:50 compute-2 nova_compute[225701]: 2026-01-23 10:11:50.661 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:50 compute-2 nova_compute[225701]: 2026-01-23 10:11:50.939 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:51 compute-2 ceph-mon[75771]: pgmap v549: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:51.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:51 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:51 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:52.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:52 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:53.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:53 compute-2 ceph-mon[75771]: pgmap v550: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:11:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:54.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:54 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:55.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:11:55.473 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:11:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:11:55.474 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:11:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:11:55.475 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:11:55 compute-2 ceph-mon[75771]: pgmap v551: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:55 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:55 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:56.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:56 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:56 compute-2 ceph-mon[75771]: pgmap v552: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.785 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.785 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.786 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.786 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.815 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.815 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.816 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.816 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.816 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.816 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.817 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.817 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:11:56 compute-2 nova_compute[225701]: 2026-01-23 10:11:56.817 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.121 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.121 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.122 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.122 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.122 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:11:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:57.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:11:57 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/354272967' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.628 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.790 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.791 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5238MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.791 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.791 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:11:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:57 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:57 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1555214160' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4280706252' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/354272967' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.958 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.958 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:11:57 compute-2 nova_compute[225701]: 2026-01-23 10:11:57.982 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:11:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:11:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:58.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:11:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:58 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3890008546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:58 compute-2 ceph-mon[75771]: pgmap v553: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3720768701' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:11:58 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4085484174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:58 compute-2 nova_compute[225701]: 2026-01-23 10:11:58.935 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.952s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:11:58 compute-2 nova_compute[225701]: 2026-01-23 10:11:58.941 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:11:58 compute-2 nova_compute[225701]: 2026-01-23 10:11:58.957 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:11:58 compute-2 nova_compute[225701]: 2026-01-23 10:11:58.958 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:11:58 compute-2 nova_compute[225701]: 2026-01-23 10:11:58.959 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:11:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:11:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:11:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:59.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:59 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:11:59 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:11:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:11:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4085484174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:12:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:00.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:00 compute-2 ceph-mon[75771]: pgmap v554: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:01.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:01 compute-2 podman[226516]: 2026-01-23 10:12:01.63064889 +0000 UTC m=+0.057607078 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 10:12:01 compute-2 podman[226515]: 2026-01-23 10:12:01.71448534 +0000 UTC m=+0.143934199 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 10:12:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:01 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:01 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:01 compute-2 sudo[226560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:12:01 compute-2 sudo[226560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:01 compute-2 sudo[226560]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:12:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 5807 writes, 24K keys, 5807 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5807 writes, 987 syncs, 5.88 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 440 writes, 717 keys, 440 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
                                           Interval WAL: 440 writes, 204 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 10:12:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:02.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:02 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:03.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:03 compute-2 ceph-mon[75771]: pgmap v555: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:12:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:03 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:03 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:04.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:04 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:05.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:05 compute-2 ceph-mon[75771]: pgmap v556: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:12:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:05 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:05 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:06.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:06 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:07.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:07 compute-2 ceph-mon[75771]: pgmap v557: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:07 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:07 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:12:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:08.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:12:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:08 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:09 compute-2 ceph-mon[75771]: pgmap v558: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:09.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:09 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:09 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:10.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:10 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:11.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:11 compute-2 ceph-mon[75771]: pgmap v559: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:11 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa380012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:11 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003ca0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:12.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:12 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:13.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:13 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:13 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa380012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:14 compute-2 ceph-mon[75771]: pgmap v560: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:12:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:12:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:14.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:12:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:14 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:15 compute-2 ceph-mon[75771]: pgmap v561: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:15.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:15 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:15 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:12:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:16.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:12:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:16 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa380012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:17.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:17 compute-2 ceph-mon[75771]: pgmap v562: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:17 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:17 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:18.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:18 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:12:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:19.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:12:19 compute-2 ceph-mon[75771]: pgmap v563: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:19 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:19 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:20.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:12:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:20 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:21.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:21 compute-2 ceph-mon[75771]: pgmap v564: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:21 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:21 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:22 compute-2 sudo[226608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:12:22 compute-2 sudo[226608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:22 compute-2 sudo[226608]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:22.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:22 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa380089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:23.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:23 compute-2 ceph-mon[75771]: pgmap v565: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:12:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:23 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa380089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:23 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:24 compute-2 sudo[226636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:12:24 compute-2 sudo[226636]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:24 compute-2 sudo[226636]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:24 compute-2 sudo[226661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:12:24 compute-2 sudo[226661]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:24.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:24 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:24 compute-2 sudo[226661]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:25 compute-2 ceph-mon[75771]: pgmap v566: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:25.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:25 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:25 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:12:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:26.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:12:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:26 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:12:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:12:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:12:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:12:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:12:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:12:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:12:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:27.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101227 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:12:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:27 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa08000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:27 compute-2 ceph-mon[75771]: pgmap v567: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:27 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:28.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:28 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101229 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:12:29 compute-2 ceph-mon[75771]: pgmap v568: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:29.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:29 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:29 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa380089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:30.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:30 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:31 compute-2 ceph-mon[75771]: pgmap v569: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:31.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:31 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa080016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:31 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:32.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:32 compute-2 sudo[226728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:12:32 compute-2 sudo[226728]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:32 compute-2 sudo[226728]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:32 compute-2 podman[226753]: 2026-01-23 10:12:32.597570445 +0000 UTC m=+0.066628851 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 23 10:12:32 compute-2 podman[226752]: 2026-01-23 10:12:32.626801681 +0000 UTC m=+0.101518812 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 10:12:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:32 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:32 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:12:32 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:12:32 compute-2 ceph-mon[75771]: pgmap v570: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:12:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Jan 23 10:12:33 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3457274779' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:33.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:33 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004460 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:33 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa08001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:34.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:34 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3457274779' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-2 ceph-mon[75771]: from='client.24496 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1987316555' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-2 ceph-mon[75771]: from='client.14898 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-2 ceph-mon[75771]: from='client.14898 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:34 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:35.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:35 compute-2 ceph-mon[75771]: pgmap v571: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:12:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:12:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:35 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:35 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:36.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:36 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa08001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:37.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:37 compute-2 ceph-mon[75771]: pgmap v572: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 85 B/s wr, 114 op/s
Jan 23 10:12:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:37 : epoch 6973492c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:12:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:37 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:37 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:38.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:38 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c0044a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:39.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:39 compute-2 ceph-mon[75771]: pgmap v573: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 85 B/s wr, 114 op/s
Jan 23 10:12:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:39 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa08001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:39 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:40.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:40 : epoch 6973492c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:12:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:40 : epoch 6973492c : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:12:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:40 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:41.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:41 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c0044c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:41 compute-2 ceph-mon[75771]: pgmap v574: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 85 B/s wr, 114 op/s
Jan 23 10:12:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:41 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa080032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:42 compute-2 sudo[226803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:12:42 compute-2 sudo[226803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:42 compute-2 sudo[226803]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:42.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:42 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:42 compute-2 ceph-mon[75771]: pgmap v575: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 81 KiB/s rd, 938 B/s wr, 134 op/s
Jan 23 10:12:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:43.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:43 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:43 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c0044c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:44 : epoch 6973492c : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:12:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:44.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:44 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa080032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:45.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:45 compute-2 ceph-mon[75771]: pgmap v576: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 81 KiB/s rd, 938 B/s wr, 134 op/s
Jan 23 10:12:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:45 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:45 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:46 : epoch 6973492c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:12:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:46.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:46 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c0044c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:12:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:47.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:12:47 compute-2 ceph-mon[75771]: pgmap v577: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 109 KiB/s rd, 1023 B/s wr, 182 op/s
Jan 23 10:12:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:47 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa08004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:47 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:48.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:48 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:49 : epoch 6973492c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:12:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:49 : epoch 6973492c : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:12:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:49.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101249 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:12:49 compute-2 ceph-mon[75771]: pgmap v578: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 938 B/s wr, 67 op/s
Jan 23 10:12:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3459287412' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:12:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3459287412' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:12:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:49 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c0044c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:49 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa08004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:50.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:12:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:50 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:50 : epoch 6973492c : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:12:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:51.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:51 compute-2 ceph-mon[75771]: pgmap v579: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 938 B/s wr, 67 op/s
Jan 23 10:12:51 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/272537903' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:51 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3119665311' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:51 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:51 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c0044c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:52.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:52 compute-2 ceph-mon[75771]: from='client.14940 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 23 10:12:52 compute-2 ceph-mon[75771]: from='client.24536 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 23 10:12:52 compute-2 ceph-mon[75771]: from='client.24536 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Jan 23 10:12:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:52 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa08004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101253 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:12:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:53.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:54 compute-2 ceph-mon[75771]: pgmap v580: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 42 KiB/s rd, 1.7 KiB/s wr, 70 op/s
Jan 23 10:12:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:54.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:54 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c0044e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:55 compute-2 ceph-mon[75771]: pgmap v581: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 852 B/s wr, 50 op/s
Jan 23 10:12:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:12:55.475 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:12:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:12:55.475 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:12:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:12:55.476 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:12:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:55.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:55 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa08004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:55 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa14003f80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:56.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:56 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:57 compute-2 ceph-mon[75771]: pgmap v582: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 852 B/s wr, 50 op/s
Jan 23 10:12:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:12:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:57.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:12:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:57 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:57 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa08004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:12:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:58.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:12:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:58 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.952 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.952 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.970 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.971 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.971 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.983 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.983 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.983 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.983 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.984 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.984 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.984 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.984 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:12:58 compute-2 nova_compute[225701]: 2026-01-23 10:12:58.985 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:59 compute-2 nova_compute[225701]: 2026-01-23 10:12:59.007 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:12:59 compute-2 nova_compute[225701]: 2026-01-23 10:12:59.008 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:12:59 compute-2 nova_compute[225701]: 2026-01-23 10:12:59.008 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:12:59 compute-2 nova_compute[225701]: 2026-01-23 10:12:59.008 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:12:59 compute-2 nova_compute[225701]: 2026-01-23 10:12:59.008 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:12:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:12:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:12:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:59.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:59 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:12:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:12:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:12:59 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:59 compute-2 ceph-mon[75771]: pgmap v583: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 767 B/s wr, 2 op/s
Jan 23 10:12:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/811842954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:12:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/190810241' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:13:00 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/507459087' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:00 compute-2 nova_compute[225701]: 2026-01-23 10:13:00.330 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.322s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:13:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:00.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:00 compute-2 nova_compute[225701]: 2026-01-23 10:13:00.514 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:13:00 compute-2 nova_compute[225701]: 2026-01-23 10:13:00.515 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5219MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:13:00 compute-2 nova_compute[225701]: 2026-01-23 10:13:00.516 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:13:00 compute-2 nova_compute[225701]: 2026-01-23 10:13:00.516 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:13:00 compute-2 nova_compute[225701]: 2026-01-23 10:13:00.576 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:13:00 compute-2 nova_compute[225701]: 2026-01-23 10:13:00.576 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:13:00 compute-2 nova_compute[225701]: 2026-01-23 10:13:00.597 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:13:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:00 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:13:01 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/814723742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2632395150' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/346920199' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/507459087' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:01 compute-2 ceph-mon[75771]: pgmap v584: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 767 B/s wr, 2 op/s
Jan 23 10:13:01 compute-2 nova_compute[225701]: 2026-01-23 10:13:01.091 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:13:01 compute-2 nova_compute[225701]: 2026-01-23 10:13:01.097 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:13:01 compute-2 nova_compute[225701]: 2026-01-23 10:13:01.118 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:13:01 compute-2 nova_compute[225701]: 2026-01-23 10:13:01.119 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:13:01 compute-2 nova_compute[225701]: 2026-01-23 10:13:01.120 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:13:01 compute-2 anacron[2783]: Job `cron.monthly' started
Jan 23 10:13:01 compute-2 anacron[2783]: Job `cron.monthly' terminated
Jan 23 10:13:01 compute-2 anacron[2783]: Normal exit (3 jobs run)
Jan 23 10:13:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:01.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:01 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa24000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:01 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:02 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/814723742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:02 compute-2 sudo[226896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:13:02 compute-2 sudo[226896]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:02 compute-2 sudo[226896]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:02.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:02 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:03 compute-2 ceph-mon[75771]: pgmap v585: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Jan 23 10:13:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:03.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:03 compute-2 podman[226923]: 2026-01-23 10:13:03.6244822 +0000 UTC m=+0.048575369 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 10:13:03 compute-2 podman[226922]: 2026-01-23 10:13:03.65547304 +0000 UTC m=+0.080344028 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Jan 23 10:13:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:03 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:03 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa240024b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:04.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:04 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa38009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:05.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:05 compute-2 ceph-mon[75771]: pgmap v586: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:13:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:05 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:05 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:06.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:06 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:07.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:07 compute-2 ceph-mon[75771]: pgmap v587: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:07 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30002040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:07 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa240024b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:08.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:08 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa2c004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:09.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:09 compute-2 ceph-mon[75771]: pgmap v588: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:09 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa10002a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:09 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa30002040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:10.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:10 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa240031c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:10 compute-2 ceph-mon[75771]: pgmap v589: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:11.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:11 compute-2 kernel: ganesha.nfsd[226843]: segfault at 50 ip 00007faabb96232e sp 00007faa5affc210 error 4 in libntirpc.so.5.8[7faabb947000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 23 10:13:11 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:13:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:13:11 : epoch 6973492c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faa240031c0 fd 39 proxy ignored for local
Jan 23 10:13:11 compute-2 systemd[1]: Started Process Core Dump (PID 226981/UID 0).
Jan 23 10:13:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:12.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:13 compute-2 systemd-coredump[226982]: Process 225593 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 61:
                                                    #0  0x00007faabb96232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:13:13 compute-2 systemd[1]: systemd-coredump@9-226981-0.service: Deactivated successfully.
Jan 23 10:13:13 compute-2 systemd[1]: systemd-coredump@9-226981-0.service: Consumed 1.325s CPU time.
Jan 23 10:13:13 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:13:13 compute-2 podman[226990]: 2026-01-23 10:13:13.33695652 +0000 UTC m=+0.026349260 container died 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 10:13:13 compute-2 systemd[1]: var-lib-containers-storage-overlay-7a7d889ca7155de289c02ef8a64720d2cb4293985fa9132c6bb9ef15a832b68d-merged.mount: Deactivated successfully.
Jan 23 10:13:13 compute-2 podman[226990]: 2026-01-23 10:13:13.373312827 +0000 UTC m=+0.062705567 container remove 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:13:13 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:13:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:13 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:13:13 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.753s CPU time.
Jan 23 10:13:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:13.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:13 compute-2 ceph-mon[75771]: pgmap v590: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:13:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:14.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:15 compute-2 ceph-mon[75771]: pgmap v591: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:15.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:16.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:17 compute-2 ceph-mon[75771]: pgmap v592: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:17.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101317 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:13:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:13:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:18.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:13:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:19 compute-2 ceph-mon[75771]: pgmap v593: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:19.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:13:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:20.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:21.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:21 compute-2 ceph-mon[75771]: pgmap v594: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:22 compute-2 sudo[227039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:13:22 compute-2 sudo[227039]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:22 compute-2 sudo[227039]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:22.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:23 compute-2 ceph-mon[75771]: pgmap v595: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:13:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:23 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 10.
Jan 23 10:13:23 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:13:23 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.753s CPU time.
Jan 23 10:13:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:23.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:23 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:13:23 compute-2 podman[227115]: 2026-01-23 10:13:23.883208589 +0000 UTC m=+0.088652217 container create d31d9134b93b4b3750666f066eb8969836306147460876eda839ac973c41f97e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 23 10:13:23 compute-2 podman[227115]: 2026-01-23 10:13:23.818899704 +0000 UTC m=+0.024343362 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:13:23 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5102798cab23695af3c9da9dd9606613e6b3ea4e6553d09e76b34a3d22c39e89/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:13:23 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5102798cab23695af3c9da9dd9606613e6b3ea4e6553d09e76b34a3d22c39e89/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:13:23 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5102798cab23695af3c9da9dd9606613e6b3ea4e6553d09e76b34a3d22c39e89/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:13:23 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5102798cab23695af3c9da9dd9606613e6b3ea4e6553d09e76b34a3d22c39e89/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:13:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:24 compute-2 podman[227115]: 2026-01-23 10:13:24.049547275 +0000 UTC m=+0.254991003 container init d31d9134b93b4b3750666f066eb8969836306147460876eda839ac973c41f97e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 23 10:13:24 compute-2 podman[227115]: 2026-01-23 10:13:24.054934044 +0000 UTC m=+0.260377712 container start d31d9134b93b4b3750666f066eb8969836306147460876eda839ac973c41f97e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:13:24 compute-2 bash[227115]: d31d9134b93b4b3750666f066eb8969836306147460876eda839ac973c41f97e
Jan 23 10:13:24 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:24.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:25.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:25 compute-2 ceph-mon[75771]: pgmap v596: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:13:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:26.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:27 compute-2 ceph-mon[75771]: pgmap v597: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:13:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:27.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:28.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:29 compute-2 ceph-mon[75771]: pgmap v598: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:13:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:29.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:30.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:30 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:13:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:30 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:13:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:31.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:31 compute-2 ceph-mon[75771]: pgmap v599: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:13:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:32.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:32 compute-2 sudo[227183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:13:32 compute-2 sudo[227183]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:32 compute-2 sudo[227183]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:32 compute-2 sudo[227208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:13:32 compute-2 sudo[227208]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:33 compute-2 ceph-mon[75771]: pgmap v600: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 852 B/s wr, 3 op/s
Jan 23 10:13:33 compute-2 sudo[227208]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:33.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:34 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:13:34 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:13:34 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:13:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:34.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:34 compute-2 podman[227267]: 2026-01-23 10:13:34.643523221 +0000 UTC m=+0.067976293 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 10:13:34 compute-2 podman[227266]: 2026-01-23 10:13:34.680126944 +0000 UTC m=+0.104995255 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:13:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:35.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:13:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:13:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:13:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:13:35 compute-2 ceph-mon[75771]: pgmap v601: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 852 B/s wr, 2 op/s
Jan 23 10:13:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:13:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:36.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:13:36 compute-2 ceph-mon[75771]: pgmap v602: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:37.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:37 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed4000da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:37 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:38.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:38 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:39.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:39 compute-2 sudo[227330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:13:39 compute-2 sudo[227330]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:39 compute-2 sudo[227330]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:39 compute-2 ceph-mon[75771]: pgmap v603: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:13:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:13:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101339 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:13:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:39 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:39 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed4001ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:40.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:40 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:13:41 compute-2 ceph-mon[75771]: pgmap v604: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:13:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:41.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:41 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:42 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:42 compute-2 sudo[227359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:13:42 compute-2 sudo[227359]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:42 compute-2 sudo[227359]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:42.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:42 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed4001ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:43.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:43 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:44 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:44.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:44 compute-2 ceph-mon[75771]: pgmap v605: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:13:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:44 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:45.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:45 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:45 compute-2 ceph-mon[75771]: pgmap v606: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 170 B/s wr, 0 op/s
Jan 23 10:13:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:46 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:46.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:46 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:47 compute-2 ceph-mon[75771]: pgmap v607: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 170 B/s wr, 0 op/s
Jan 23 10:13:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:47.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:47 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:48 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:48.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:48 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:13:49.241 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:13:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:13:49.244 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:13:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:13:49.246 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:13:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:49.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:49 compute-2 ceph-mon[75771]: pgmap v608: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:13:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/239642082' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:13:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/239642082' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:13:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:49 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:50 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:50.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:50 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:13:50 compute-2 ceph-mon[75771]: pgmap v609: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:13:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:51.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:51 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:52 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:52.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:52 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:53.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:53 compute-2 ceph-mon[75771]: pgmap v610: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:13:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:53 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:54 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:54.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:54 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:13:55.476 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:13:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:13:55.476 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:13:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:13:55.477 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:13:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:13:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:55.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:13:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:55 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:56 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:56 compute-2 ceph-mon[75771]: pgmap v611: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:56.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:56 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:13:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:57.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:13:57 compute-2 ceph-mon[75771]: pgmap v612: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:13:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:57 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:58 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:58.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:58 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/774375893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4074051618' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:13:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:13:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:13:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:59.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:13:59 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:13:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:00 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:14:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:00.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:14:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:00 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.121 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.122 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.122 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.123 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.143 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.144 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.144 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.144 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.145 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.145 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.145 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.145 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.145 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:01.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.628 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.629 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.629 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.629 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:14:01 compute-2 nova_compute[225701]: 2026-01-23 10:14:01.629 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:14:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:01 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:02 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:14:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2596105723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.204 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.574s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:14:02 compute-2 ceph-mon[75771]: pgmap v613: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.368 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.370 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5245MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.370 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.370 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.436 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.437 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:14:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:02 compute-2 sudo[227426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:14:02 compute-2 sudo[227426]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:02 compute-2 sudo[227426]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.454 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:14:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:02.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:02 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:14:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3988597975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.894 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:14:02 compute-2 nova_compute[225701]: 2026-01-23 10:14:02.899 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:14:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:03 compute-2 nova_compute[225701]: 2026-01-23 10:14:03.133 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:14:03 compute-2 nova_compute[225701]: 2026-01-23 10:14:03.135 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:14:03 compute-2 nova_compute[225701]: 2026-01-23 10:14:03.135 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:14:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:03.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:03 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:04 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:04 compute-2 ceph-mon[75771]: pgmap v614: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2203974394' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2596105723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:04 compute-2 ceph-mon[75771]: pgmap v615: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/515581582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3988597975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:04.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:04 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:05.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:05 compute-2 podman[227476]: 2026-01-23 10:14:05.638521166 +0000 UTC m=+0.058256351 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:14:05 compute-2 podman[227475]: 2026-01-23 10:14:05.667497516 +0000 UTC m=+0.089985737 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 10:14:05 compute-2 ceph-mon[75771]: pgmap v616: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:05 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:06 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:06.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:06 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:07 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:14:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:07.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:07 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40027c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:08 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:08 compute-2 ceph-mon[75771]: pgmap v617: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:08.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:08 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:09 compute-2 ceph-mon[75771]: pgmap v618: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:09.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:09 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:10 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:10.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:10 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:11 compute-2 ceph-mon[75771]: pgmap v619: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:11.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:11 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:12 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:14:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:12.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:14:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:12 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:13.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:13 compute-2 ceph-mon[75771]: pgmap v620: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:13 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8001cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:14 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:14.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:14 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:15 compute-2 ceph-mon[75771]: pgmap v621: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:15.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:15 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:16 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8001cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:16.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:16 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:16 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:17 compute-2 ceph-mon[75771]: pgmap v622: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:17.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:17 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:18 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:18.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:18 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8001cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:18 compute-2 ceph-mon[75771]: pgmap v623: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:19.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:19 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:20 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:14:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:14:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:20.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:14:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:20 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:21 compute-2 ceph-mon[75771]: pgmap v624: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:21.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:21 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec80030a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:22 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:22 compute-2 sudo[227538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:14:22 compute-2 sudo[227538]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:22 compute-2 sudo[227538]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:22.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:22 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:14:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:23.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:14:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:23 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec80030a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:24 compute-2 ceph-mon[75771]: pgmap v625: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:24.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:14:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:25.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:14:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:25 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:26 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:26.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:26 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:27.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:27 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:28 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:14:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:28.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:14:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:28 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:29 compute-2 ceph-mon[75771]: pgmap v626: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:29.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:29 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:30 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:30.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:30 compute-2 ceph-mon[75771]: pgmap v627: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:30 compute-2 ceph-mon[75771]: pgmap v628: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:30 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:31.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:31 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:32 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:14:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:32.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:14:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:32 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:33 compute-2 ceph-mon[75771]: pgmap v629: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:14:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:33.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:14:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:33 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:34 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:34 compute-2 ceph-mon[75771]: pgmap v630: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:34.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:34 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:35 compute-2 ceph-mon[75771]: pgmap v631: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:14:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:35.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:35 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:36.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:36 compute-2 podman[227579]: 2026-01-23 10:14:36.661860123 +0000 UTC m=+0.076925576 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 10:14:36 compute-2 podman[227578]: 2026-01-23 10:14:36.70055349 +0000 UTC m=+0.113887697 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 10:14:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:37 compute-2 ceph-mon[75771]: pgmap v632: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:14:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:37.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:14:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:37 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:38 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:38.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:38 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:39.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:39 compute-2 ceph-mon[75771]: pgmap v633: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:39 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:39 compute-2 sudo[227626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:14:39 compute-2 sudo[227626]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:39 compute-2 sudo[227626]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:40 compute-2 sudo[227652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 10:14:40 compute-2 sudo[227652]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:40 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:40 compute-2 podman[227751]: 2026-01-23 10:14:40.526096597 +0000 UTC m=+0.053649054 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 10:14:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:40.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:40 compute-2 podman[227751]: 2026-01-23 10:14:40.632947813 +0000 UTC m=+0.160500250 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:14:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:40 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 10:14:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 10:14:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:41 compute-2 podman[227851]: 2026-01-23 10:14:41.000509488 +0000 UTC m=+0.058273083 container exec 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:14:41 compute-2 podman[227851]: 2026-01-23 10:14:41.014165866 +0000 UTC m=+0.071929461 container exec_died 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:14:41 compute-2 podman[227960]: 2026-01-23 10:14:41.423206076 +0000 UTC m=+0.059627311 container exec d31d9134b93b4b3750666f066eb8969836306147460876eda839ac973c41f97e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Jan 23 10:14:41 compute-2 podman[227960]: 2026-01-23 10:14:41.432970973 +0000 UTC m=+0.069392198 container exec_died d31d9134b93b4b3750666f066eb8969836306147460876eda839ac973c41f97e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:14:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:14:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:41.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:14:41 compute-2 podman[228023]: 2026-01-23 10:14:41.671555862 +0000 UTC m=+0.053806017 container exec c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 10:14:41 compute-2 podman[228023]: 2026-01-23 10:14:41.698169744 +0000 UTC m=+0.080419899 container exec_died c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 10:14:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:41 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:41 compute-2 podman[228090]: 2026-01-23 10:14:41.936384065 +0000 UTC m=+0.058506267 container exec 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, architecture=x86_64, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=keepalived-container, name=keepalived, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 23 10:14:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:41 compute-2 ceph-mon[75771]: pgmap v634: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:41 compute-2 podman[228090]: 2026-01-23 10:14:41.979101458 +0000 UTC m=+0.101223620 container exec_died 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, architecture=x86_64, release=1793, build-date=2023-02-22T09:23:20, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.expose-services=, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vendor=Red Hat, Inc.)
Jan 23 10:14:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:42 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:42 compute-2 sudo[227652]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:42 compute-2 sudo[228161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:14:42 compute-2 sudo[228161]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:42 compute-2 sudo[228161]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:42 compute-2 sudo[228186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:14:42 compute-2 sudo[228186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:42.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:42 compute-2 sudo[228211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:14:42 compute-2 sudo[228211]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:42 compute-2 sudo[228211]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:42 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:43 compute-2 sudo[228186]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:14:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:43.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:14:43 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:43 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:43 compute-2 ceph-mon[75771]: pgmap v635: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:43 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 10:14:43 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:14:43 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:14:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:43 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:44 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:14:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:44.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:14:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:44 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:44 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:44 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:44 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:14:44 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:14:44 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:14:44 compute-2 ceph-mon[75771]: pgmap v636: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:45.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:45 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:46 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:46.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:46 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.213485) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287213706, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2383, "num_deletes": 251, "total_data_size": 6407276, "memory_usage": 6496112, "flush_reason": "Manual Compaction"}
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Jan 23 10:14:47 compute-2 ceph-mon[75771]: pgmap v637: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287257759, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4161547, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20662, "largest_seqno": 23039, "table_properties": {"data_size": 4151860, "index_size": 6117, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19752, "raw_average_key_size": 20, "raw_value_size": 4132530, "raw_average_value_size": 4225, "num_data_blocks": 268, "num_entries": 978, "num_filter_entries": 978, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163066, "oldest_key_time": 1769163066, "file_creation_time": 1769163287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 44331 microseconds, and 11063 cpu microseconds.
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.257860) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4161547 bytes OK
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.257902) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.260308) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.260368) EVENT_LOG_v1 {"time_micros": 1769163287260363, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.260390) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6396764, prev total WAL file size 6396764, number of live WAL files 2.
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.262228) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(4064KB)], [39(12MB)]
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287262491, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17420209, "oldest_snapshot_seqno": -1}
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5500 keys, 15176937 bytes, temperature: kUnknown
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287400623, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 15176937, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15137279, "index_size": 24828, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 138614, "raw_average_key_size": 25, "raw_value_size": 15034712, "raw_average_value_size": 2733, "num_data_blocks": 1026, "num_entries": 5500, "num_filter_entries": 5500, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.400956) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 15176937 bytes
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.402579) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.0 rd, 109.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.6 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(7.8) write-amplify(3.6) OK, records in: 6018, records dropped: 518 output_compression: NoCompression
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.402601) EVENT_LOG_v1 {"time_micros": 1769163287402591, "job": 22, "event": "compaction_finished", "compaction_time_micros": 138273, "compaction_time_cpu_micros": 50057, "output_level": 6, "num_output_files": 1, "total_output_size": 15176937, "num_input_records": 6018, "num_output_records": 5500, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287403775, "job": 22, "event": "table_file_deletion", "file_number": 41}
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287406800, "job": 22, "event": "table_file_deletion", "file_number": 39}
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.262009) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.406937) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.406944) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.406947) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.406949) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:14:47.406999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:47.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:47 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:48 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:48.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:48 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:49 compute-2 sudo[228272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:14:49 compute-2 sudo[228272]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:49 compute-2 sudo[228272]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:49 compute-2 ceph-mon[75771]: pgmap v638: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3340243466' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:14:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3340243466' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:14:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:14:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:49.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:14:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:49 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:50 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:50.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:14:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:50 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:51.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:51 compute-2 ceph-mon[75771]: pgmap v639: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:51 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:52 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:14:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:52.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:14:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:52 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:53.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:53 compute-2 ceph-mon[75771]: pgmap v640: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:53 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:54 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:14:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:54.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:14:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:54 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:54 compute-2 ceph-mon[75771]: pgmap v641: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:14:55.477 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:14:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:14:55.478 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:14:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:14:55.478 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:14:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:14:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:55.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:14:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:55 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:56 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:56.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:56 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:57.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:57 compute-2 ceph-mon[75771]: pgmap v642: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:57 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:58 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:14:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:58.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:14:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:58 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:14:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:14:59 compute-2 ceph-mon[75771]: pgmap v643: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:14:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:14:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:59.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:14:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:14:59 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:14:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:00 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/315230113' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:00.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:00 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.130044) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301130075, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 398, "num_deletes": 250, "total_data_size": 509751, "memory_usage": 518048, "flush_reason": "Manual Compaction"}
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301134281, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 319310, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23044, "largest_seqno": 23437, "table_properties": {"data_size": 316996, "index_size": 478, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 6033, "raw_average_key_size": 19, "raw_value_size": 312329, "raw_average_value_size": 1017, "num_data_blocks": 20, "num_entries": 307, "num_filter_entries": 307, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163288, "oldest_key_time": 1769163288, "file_creation_time": 1769163301, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 4281 microseconds, and 2054 cpu microseconds.
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.134324) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 319310 bytes OK
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.134341) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.135433) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.135448) EVENT_LOG_v1 {"time_micros": 1769163301135444, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.135464) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 507171, prev total WAL file size 507171, number of live WAL files 2.
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.135856) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353033' seq:72057594037927935, type:22 .. '6D67727374617400373534' seq:0, type:0; will stop at (end)
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(311KB)], [42(14MB)]
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301135882, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 15496247, "oldest_snapshot_seqno": -1}
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5297 keys, 11393449 bytes, temperature: kUnknown
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301214777, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 11393449, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11359630, "index_size": 19501, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13253, "raw_key_size": 134813, "raw_average_key_size": 25, "raw_value_size": 11264982, "raw_average_value_size": 2126, "num_data_blocks": 794, "num_entries": 5297, "num_filter_entries": 5297, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163301, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.215197) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 11393449 bytes
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.216834) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 196.1 rd, 144.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 14.5 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(84.2) write-amplify(35.7) OK, records in: 5807, records dropped: 510 output_compression: NoCompression
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.216857) EVENT_LOG_v1 {"time_micros": 1769163301216846, "job": 24, "event": "compaction_finished", "compaction_time_micros": 79040, "compaction_time_cpu_micros": 28663, "output_level": 6, "num_output_files": 1, "total_output_size": 11393449, "num_input_records": 5807, "num_output_records": 5297, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301217290, "job": 24, "event": "table_file_deletion", "file_number": 44}
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301220744, "job": 24, "event": "table_file_deletion", "file_number": 42}
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.135797) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.220830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.220836) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.220837) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.220839) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:15:01.220841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:01.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:01 compute-2 ceph-mon[75771]: pgmap v644: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2458334425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:01 compute-2 nova_compute[225701]: 2026-01-23 10:15:01.792 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:01 compute-2 nova_compute[225701]: 2026-01-23 10:15:01.792 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:01 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.001 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.001 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.001 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:15:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:02 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ebc004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.229 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.229 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.229 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.230 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.230 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.230 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.230 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.231 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.269 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.269 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.269 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.270 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.270 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:15:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:02.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:02 compute-2 sudo[228331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:15:02 compute-2 sudo[228331]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:02 compute-2 sudo[228331]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:15:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4283179534' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:02 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/817666258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.803 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:15:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:02 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.959 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.960 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5198MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.960 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:15:02 compute-2 nova_compute[225701]: 2026-01-23 10:15:02.960 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:15:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:03 compute-2 nova_compute[225701]: 2026-01-23 10:15:03.030 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:15:03 compute-2 nova_compute[225701]: 2026-01-23 10:15:03.030 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:15:03 compute-2 nova_compute[225701]: 2026-01-23 10:15:03.069 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:15:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:15:03 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3138463557' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:03 compute-2 nova_compute[225701]: 2026-01-23 10:15:03.550 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:15:03 compute-2 nova_compute[225701]: 2026-01-23 10:15:03.555 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:15:03 compute-2 nova_compute[225701]: 2026-01-23 10:15:03.582 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:15:03 compute-2 nova_compute[225701]: 2026-01-23 10:15:03.583 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:15:03 compute-2 nova_compute[225701]: 2026-01-23 10:15:03.584 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:15:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:03.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:03 compute-2 ceph-mon[75771]: pgmap v645: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:15:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4283179534' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3691473830' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3138463557' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:03 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:04 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003d10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:04 compute-2 nova_compute[225701]: 2026-01-23 10:15:04.136 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:04.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:04 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:04 compute-2 ceph-mon[75771]: pgmap v646: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:15:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:05.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:15:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:05 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:15:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:06 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:06.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:06 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003d10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:07 compute-2 ceph-mon[75771]: pgmap v647: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:15:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:07 compute-2 podman[228387]: 2026-01-23 10:15:07.632094128 +0000 UTC m=+0.053838939 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:15:07 compute-2 podman[228386]: 2026-01-23 10:15:07.670394657 +0000 UTC m=+0.096716304 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 10:15:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:07.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:07 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:08 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:08.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:08 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:09 compute-2 ceph-mon[75771]: pgmap v648: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:09.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:09 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:10 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:10.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:10 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee800a640 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:11.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:11 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:11 compute-2 ceph-mon[75771]: pgmap v649: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:12 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:12.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:12 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:13.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:13 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:14 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:14 compute-2 ceph-mon[75771]: pgmap v650: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:15:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:15:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:14.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:15:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:14 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:15.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:15 compute-2 ceph-mon[75771]: pgmap v651: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:15 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc003eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:16 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee800a640 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:16.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:16 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:16 compute-2 ceph-mon[75771]: pgmap v652: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:15:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:17.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:17 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:18 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:18.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:18 compute-2 ceph-mon[75771]: pgmap v653: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:18 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:19.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:19 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:20 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40008d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:20.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:20 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:15:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:21.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:21 compute-2 ceph-mon[75771]: pgmap v654: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:21 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:22 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:22.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:22 compute-2 sudo[228447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:15:22 compute-2 sudo[228447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:22 compute-2 sudo[228447]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:22 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40008d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:23.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:23 compute-2 ceph-mon[75771]: pgmap v655: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:15:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:23 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:24.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:24 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:25.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:25 compute-2 ceph-mon[75771]: pgmap v656: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:25 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40008d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:26 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:26.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:26 compute-2 ceph-mon[75771]: pgmap v657: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:15:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:26 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:27.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:27 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:28 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed4002bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:28.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:28 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:29.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:29 compute-2 ceph-mon[75771]: pgmap v658: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:29 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:30 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec003860 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:30.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:30 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed4002bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:30 compute-2 ceph-mon[75771]: pgmap v659: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:15:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:31.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:15:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:31 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004160 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:32 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:32.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:32 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec003860 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:33 compute-2 ceph-mon[75771]: pgmap v660: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:15:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:33.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:33 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=404 latency=0.002000041s ======
Jan 23 10:15:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:34.083 +0000] "GET /healthcheck HTTP/1.1" 404 242 - "python-urllib3/1.26.5" - latency=0.002000041s
Jan 23 10:15:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:34 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:34.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:34 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:34 compute-2 ceph-mon[75771]: pgmap v661: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:35.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:35 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec003860 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:15:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:36.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:36 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:37 compute-2 ceph-mon[75771]: pgmap v662: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:15:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:37.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:37 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:38 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec003860 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Jan 23 10:15:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:38 compute-2 podman[228489]: 2026-01-23 10:15:38.658099459 +0000 UTC m=+0.072380380 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 10:15:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:38.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:38 compute-2 podman[228488]: 2026-01-23 10:15:38.715198145 +0000 UTC m=+0.132097531 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 10:15:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:38 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:39 compute-2 ceph-mon[75771]: osdmap e139: 3 total, 3 up, 3 in
Jan 23 10:15:39 compute-2 ceph-mon[75771]: pgmap v664: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Jan 23 10:15:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Jan 23 10:15:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:39.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:39 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:40 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:15:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:40.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:15:40 compute-2 ceph-mon[75771]: osdmap e140: 3 total, 3 up, 3 in
Jan 23 10:15:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Jan 23 10:15:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:40 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec003860 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:41 compute-2 ceph-mon[75771]: pgmap v666: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:41 compute-2 ceph-mon[75771]: osdmap e141: 3 total, 3 up, 3 in
Jan 23 10:15:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:41.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:41 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:42 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:42.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Jan 23 10:15:42 compute-2 ceph-mon[75771]: pgmap v668: 353 pgs: 353 active+clean; 21 MiB data, 178 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 3.4 MiB/s wr, 31 op/s
Jan 23 10:15:42 compute-2 sudo[228538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:15:42 compute-2 sudo[228538]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:42 compute-2 sudo[228538]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:42 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec8003f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:43.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:43 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec003860 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:44 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:44.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:44 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:45 compute-2 ceph-mon[75771]: osdmap e142: 3 total, 3 up, 3 in
Jan 23 10:15:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:45.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:45 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Jan 23 10:15:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:46 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:46 compute-2 ceph-mon[75771]: pgmap v670: 353 pgs: 353 active+clean; 21 MiB data, 178 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 3.4 MiB/s wr, 31 op/s
Jan 23 10:15:46 compute-2 ceph-mon[75771]: osdmap e143: 3 total, 3 up, 3 in
Jan 23 10:15:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:46.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:46 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:47.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:47 compute-2 ceph-mon[75771]: pgmap v672: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 6.8 MiB/s wr, 64 op/s
Jan 23 10:15:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:47 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:48 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4eec003860 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:15:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:48.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:15:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:48 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3735003259' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:15:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3735003259' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:15:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:49 compute-2 sudo[228569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:15:49 compute-2 sudo[228569]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:49 compute-2 sudo[228569]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:49.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:49 compute-2 sudo[228594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:15:49 compute-2 sudo[228594]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:49 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:50 compute-2 ceph-mon[75771]: pgmap v673: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 5.3 MiB/s wr, 49 op/s
Jan 23 10:15:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:50 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:50 compute-2 sudo[228594]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000021s ======
Jan 23 10:15:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:50.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Jan 23 10:15:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:50 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:15:51 compute-2 ceph-mon[75771]: pgmap v674: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 2.6 MiB/s wr, 24 op/s
Jan 23 10:15:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:51.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:51 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:52 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:52.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:52 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:53 compute-2 ceph-mon[75771]: pgmap v675: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 2.1 MiB/s wr, 20 op/s
Jan 23 10:15:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:53.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:53 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:54 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:54.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:54 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:15:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:15:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:15:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:15:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:15:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:15:55.479 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:15:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:15:55.481 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:15:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:15:55.481 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:15:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:55.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:55 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:56 compute-2 ceph-mon[75771]: pgmap v676: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Jan 23 10:15:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:56 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc004290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:56.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:56 compute-2 nova_compute[225701]: 2026-01-23 10:15:56.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:56 compute-2 nova_compute[225701]: 2026-01-23 10:15:56.785 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:15:56 compute-2 nova_compute[225701]: 2026-01-23 10:15:56.799 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:15:56 compute-2 nova_compute[225701]: 2026-01-23 10:15:56.801 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:56 compute-2 nova_compute[225701]: 2026-01-23 10:15:56.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:15:56 compute-2 nova_compute[225701]: 2026-01-23 10:15:56.812 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:56 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee80022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:57 compute-2 ceph-mon[75771]: pgmap v677: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 18 op/s
Jan 23 10:15:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:57.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:57 compute-2 nova_compute[225701]: 2026-01-23 10:15:57.816 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:57 compute-2 nova_compute[225701]: 2026-01-23 10:15:57.816 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:57 compute-2 nova_compute[225701]: 2026-01-23 10:15:57.817 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:15:57 compute-2 nova_compute[225701]: 2026-01-23 10:15:57.817 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:15:57 compute-2 nova_compute[225701]: 2026-01-23 10:15:57.831 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:15:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:57 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:58 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:58.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:58 compute-2 nova_compute[225701]: 2026-01-23 10:15:58.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:58 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:59 compute-2 sudo[228660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:15:59 compute-2 sudo[228660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:59 compute-2 sudo[228660]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:15:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:15:59 compute-2 ceph-mon[75771]: pgmap v678: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:59 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:59 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3650861742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:59 compute-2 nova_compute[225701]: 2026-01-23 10:15:59.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:59 compute-2 nova_compute[225701]: 2026-01-23 10:15:59.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:59 compute-2 nova_compute[225701]: 2026-01-23 10:15:59.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:15:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:15:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:15:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:59.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:15:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:15:59 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8009bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:15:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:00 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8009bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:00.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:00 compute-2 nova_compute[225701]: 2026-01-23 10:16:00.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:00 compute-2 nova_compute[225701]: 2026-01-23 10:16:00.824 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:16:00 compute-2 nova_compute[225701]: 2026-01-23 10:16:00.824 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:16:00 compute-2 nova_compute[225701]: 2026-01-23 10:16:00.824 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:16:00 compute-2 nova_compute[225701]: 2026-01-23 10:16:00.824 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:16:00 compute-2 nova_compute[225701]: 2026-01-23 10:16:00.825 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:16:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3003629658' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:00 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:16:00.987 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:16:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:16:00.988 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:16:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:16:01 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/472721166' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.276 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.418 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.419 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5220MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.419 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.420 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:16:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.624 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.624 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.675 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.733 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.733 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.754 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.775 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:16:01 compute-2 nova_compute[225701]: 2026-01-23 10:16:01.791 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:16:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:01.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:01 compute-2 ceph-mon[75771]: pgmap v679: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:16:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/472721166' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:01 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0042b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:01 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:16:01.990 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:16:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:02 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0042b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:16:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4013248704' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:02 compute-2 nova_compute[225701]: 2026-01-23 10:16:02.211 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:16:02 compute-2 nova_compute[225701]: 2026-01-23 10:16:02.216 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:16:02 compute-2 nova_compute[225701]: 2026-01-23 10:16:02.232 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:16:02 compute-2 nova_compute[225701]: 2026-01-23 10:16:02.233 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:16:02 compute-2 nova_compute[225701]: 2026-01-23 10:16:02.233 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.814s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:16:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:02.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:02 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4013248704' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:02 compute-2 ceph-mon[75771]: pgmap v680: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:16:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:02 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:02 compute-2 sudo[228733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:16:02 compute-2 sudo[228733]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:02 compute-2 sudo[228733]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:03 compute-2 nova_compute[225701]: 2026-01-23 10:16:03.233 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:03 compute-2 nova_compute[225701]: 2026-01-23 10:16:03.233 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:03 compute-2 nova_compute[225701]: 2026-01-23 10:16:03.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:16:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:03.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:16:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101603 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:16:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:03 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:04 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0042b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/455437505' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:04.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:04 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0042b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:05 compute-2 ceph-mon[75771]: pgmap v681: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:16:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:16:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3264499221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000020s ======
Jan 23 10:16:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:05.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Jan 23 10:16:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:05 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:06 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:06.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:06 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0042b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:07 compute-2 ceph-mon[75771]: pgmap v682: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:16:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:07.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:07 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8009bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:08 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:08.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:08 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:09 compute-2 podman[228765]: 2026-01-23 10:16:09.634691247 +0000 UTC m=+0.059706062 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 10:16:09 compute-2 ceph-mon[75771]: pgmap v683: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:16:09 compute-2 podman[228764]: 2026-01-23 10:16:09.728914987 +0000 UTC m=+0.153059453 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:16:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:09.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:09 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0042d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:10 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8009d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:10.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:10 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:11 compute-2 ceph-mon[75771]: pgmap v684: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:16:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:11.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:11 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ed40038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:12 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ecc0042f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:12 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:16:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:12.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:12 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ee8009d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101613 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:16:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:13 compute-2 ceph-mon[75771]: pgmap v685: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:16:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:13.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:13 compute-2 kernel: ganesha.nfsd[228382]: segfault at 50 ip 00007f4f728fc32e sp 00007f4ee77fd210 error 4 in libntirpc.so.5.8[7f4f728e1000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 10:16:13 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:16:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[227131]: 23/01/2026 10:16:13 : epoch 697349c4 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ec4003c10 fd 39 proxy ignored for local
Jan 23 10:16:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:14 compute-2 systemd[1]: Started Process Core Dump (PID 228814/UID 0).
Jan 23 10:16:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:14.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:15 compute-2 systemd-coredump[228815]: Process 227136 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 60:
                                                    #0  0x00007f4f728fc32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007f4f72906900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:16:15 compute-2 systemd[1]: systemd-coredump@10-228814-0.service: Deactivated successfully.
Jan 23 10:16:15 compute-2 systemd[1]: systemd-coredump@10-228814-0.service: Consumed 1.289s CPU time.
Jan 23 10:16:15 compute-2 podman[228821]: 2026-01-23 10:16:15.439030127 +0000 UTC m=+0.036899746 container died d31d9134b93b4b3750666f066eb8969836306147460876eda839ac973c41f97e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 23 10:16:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:15 compute-2 systemd[1]: var-lib-containers-storage-overlay-5102798cab23695af3c9da9dd9606613e6b3ea4e6553d09e76b34a3d22c39e89-merged.mount: Deactivated successfully.
Jan 23 10:16:15 compute-2 podman[228821]: 2026-01-23 10:16:15.82553323 +0000 UTC m=+0.423402849 container remove d31d9134b93b4b3750666f066eb8969836306147460876eda839ac973c41f97e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:16:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:15.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:15 compute-2 ceph-mon[75771]: pgmap v686: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:16:15 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:16:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:16 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:16:16 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.898s CPU time.
Jan 23 10:16:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:16:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:16.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:16:16 compute-2 ceph-mon[75771]: pgmap v687: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:16:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:17.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:18.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:19 compute-2 ceph-mon[75771]: pgmap v688: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:16:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/420910609' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:19.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101619 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:16:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [ALERT] 022/101619 (4) : backend 'backend' has no server available!
Jan 23 10:16:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:16:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:20.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:16:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:16:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Jan 23 10:16:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:21.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:22 compute-2 ceph-mon[75771]: pgmap v689: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:16:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:22.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e145 e145: 3 total, 3 up, 3 in
Jan 23 10:16:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:23 compute-2 sudo[228872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:16:23 compute-2 sudo[228872]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:23 compute-2 sudo[228872]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:23 compute-2 ceph-mon[75771]: osdmap e144: 3 total, 3 up, 3 in
Jan 23 10:16:23 compute-2 ceph-mon[75771]: pgmap v691: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 716 B/s wr, 10 op/s
Jan 23 10:16:23 compute-2 ceph-mon[75771]: osdmap e145: 3 total, 3 up, 3 in
Jan 23 10:16:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:23.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1399276032' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:16:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:24.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:25.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:26 compute-2 ceph-mon[75771]: pgmap v693: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 383 B/s wr, 11 op/s
Jan 23 10:16:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1942849923' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:16:26 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 11.
Jan 23 10:16:26 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:16:26 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.898s CPU time.
Jan 23 10:16:26 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:16:26 compute-2 podman[228947]: 2026-01-23 10:16:26.248176283 +0000 UTC m=+0.025264544 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:16:26 compute-2 podman[228947]: 2026-01-23 10:16:26.35401501 +0000 UTC m=+0.131103271 container create fa785a85e35a7804c787f20020accc24473951046161ad46c7682dbaa03899c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:16:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d753937e541bf38247f02c0eae4c66ab31e8c5f3996cd75a5da7d39e87934a76/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:16:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d753937e541bf38247f02c0eae4c66ab31e8c5f3996cd75a5da7d39e87934a76/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:16:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d753937e541bf38247f02c0eae4c66ab31e8c5f3996cd75a5da7d39e87934a76/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:16:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d753937e541bf38247f02c0eae4c66ab31e8c5f3996cd75a5da7d39e87934a76/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:16:26 compute-2 podman[228947]: 2026-01-23 10:16:26.419256712 +0000 UTC m=+0.196344993 container init fa785a85e35a7804c787f20020accc24473951046161ad46c7682dbaa03899c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Jan 23 10:16:26 compute-2 podman[228947]: 2026-01-23 10:16:26.424011627 +0000 UTC m=+0.201099888 container start fa785a85e35a7804c787f20020accc24473951046161ad46c7682dbaa03899c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Jan 23 10:16:26 compute-2 bash[228947]: fa785a85e35a7804c787f20020accc24473951046161ad46c7682dbaa03899c9
Jan 23 10:16:26 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:16:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:26.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:27 compute-2 ceph-mon[75771]: pgmap v694: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 54 op/s
Jan 23 10:16:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:27.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101627 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:16:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:28.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:16:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:29.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:16:29 compute-2 ceph-mon[75771]: pgmap v695: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 54 op/s
Jan 23 10:16:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:30.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:31 compute-2 ceph-mon[75771]: pgmap v696: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 2.4 MiB/s wr, 39 op/s
Jan 23 10:16:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 e146: 3 total, 3 up, 3 in
Jan 23 10:16:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:31.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:32 compute-2 ceph-mon[75771]: osdmap e146: 3 total, 3 up, 3 in
Jan 23 10:16:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:32.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 10:16:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 10:16:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:16:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:16:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:33.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:34 compute-2 ceph-mon[75771]: pgmap v698: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.4 MiB/s rd, 2.2 MiB/s wr, 128 op/s
Jan 23 10:16:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:34.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:35 compute-2 ceph-mon[75771]: pgmap v699: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 124 op/s
Jan 23 10:16:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:16:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101635 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:16:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:35.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.215909) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396216143, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1300, "num_deletes": 260, "total_data_size": 2950824, "memory_usage": 2987520, "flush_reason": "Manual Compaction"}
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396231405, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1940856, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23442, "largest_seqno": 24737, "table_properties": {"data_size": 1935237, "index_size": 2950, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12062, "raw_average_key_size": 19, "raw_value_size": 1923624, "raw_average_value_size": 3077, "num_data_blocks": 130, "num_entries": 625, "num_filter_entries": 625, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163301, "oldest_key_time": 1769163301, "file_creation_time": 1769163396, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 15519 microseconds, and 7561 cpu microseconds.
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.231485) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1940856 bytes OK
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.231530) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.233783) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.233836) EVENT_LOG_v1 {"time_micros": 1769163396233830, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.233856) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2944608, prev total WAL file size 2944608, number of live WAL files 2.
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.235083) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323535' seq:72057594037927935, type:22 .. '6C6F676D00353131' seq:0, type:0; will stop at (end)
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1895KB)], [45(10MB)]
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396235487, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13334305, "oldest_snapshot_seqno": -1}
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5382 keys, 13142672 bytes, temperature: kUnknown
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396427777, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13142672, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13106298, "index_size": 21800, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13509, "raw_key_size": 137814, "raw_average_key_size": 25, "raw_value_size": 13008188, "raw_average_value_size": 2416, "num_data_blocks": 888, "num_entries": 5382, "num_filter_entries": 5382, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163396, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.428077) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13142672 bytes
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.430264) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 69.3 rd, 68.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 10.9 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(13.6) write-amplify(6.8) OK, records in: 5922, records dropped: 540 output_compression: NoCompression
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.430282) EVENT_LOG_v1 {"time_micros": 1769163396430273, "job": 26, "event": "compaction_finished", "compaction_time_micros": 192375, "compaction_time_cpu_micros": 61083, "output_level": 6, "num_output_files": 1, "total_output_size": 13142672, "num_input_records": 5922, "num_output_records": 5382, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396430700, "job": 26, "event": "table_file_deletion", "file_number": 47}
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396432604, "job": 26, "event": "table_file_deletion", "file_number": 45}
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.234840) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.432738) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.432745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.432746) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.432748) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:16:36.432750) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:36.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:37 compute-2 ceph-mon[75771]: pgmap v700: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 16 KiB/s wr, 92 op/s
Jan 23 10:16:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:37.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:38.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000001f:nfs.cephfs.1: -2
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:16:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:39.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:39 compute-2 ceph-mon[75771]: pgmap v701: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 16 KiB/s wr, 92 op/s
Jan 23 10:16:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ef4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:40 compute-2 podman[229035]: 2026-01-23 10:16:40.625133581 +0000 UTC m=+0.045301930 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:16:40 compute-2 podman[229034]: 2026-01-23 10:16:40.658669694 +0000 UTC m=+0.086960370 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:16:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:16:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:40.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:16:40 compute-2 ceph-mon[75771]: pgmap v702: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 16 KiB/s wr, 92 op/s
Jan 23 10:16:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:16:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:41.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:16:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101641 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:16:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:41 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec001e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:42.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:43 compute-2 sudo[229083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:16:43 compute-2 sudo[229083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:43 compute-2 sudo[229083]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:43 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 10:16:43 compute-2 ceph-mon[75771]: pgmap v703: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 548 KiB/s rd, 1.2 KiB/s wr, 30 op/s
Jan 23 10:16:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:43.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:43 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:44.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:45 compute-2 ceph-mon[75771]: pgmap v704: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 767 B/s wr, 9 op/s
Jan 23 10:16:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:45.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:45 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:46.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:46 compute-2 ceph-mon[75771]: pgmap v705: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 68 op/s
Jan 23 10:16:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:47.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:47 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:48.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:49 compute-2 ceph-mon[75771]: pgmap v706: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 389 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Jan 23 10:16:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:16:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:49.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:16:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:49 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:50.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:50 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3378120931' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:16:50 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3378120931' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:16:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:16:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:51.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:51 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:52 compute-2 ceph-mon[75771]: pgmap v707: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 389 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Jan 23 10:16:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:52.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:53 compute-2 ceph-mon[75771]: pgmap v708: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 390 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Jan 23 10:16:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:53.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:53 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:54.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:16:55.481 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:16:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:16:55.481 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:16:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:16:55.482 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:16:55 compute-2 ceph-mon[75771]: pgmap v709: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 363 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 23 10:16:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:55.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:55 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:56.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:57 compute-2 nova_compute[225701]: 2026-01-23 10:16:57.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:57.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:57 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:58 compute-2 ceph-mon[75771]: pgmap v710: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 368 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 23 10:16:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:58.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:58 compute-2 nova_compute[225701]: 2026-01-23 10:16:58.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:58 compute-2 nova_compute[225701]: 2026-01-23 10:16:58.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:16:58 compute-2 nova_compute[225701]: 2026-01-23 10:16:58.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:16:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:16:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:58 compute-2 nova_compute[225701]: 2026-01-23 10:16:58.973 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:16:58 compute-2 nova_compute[225701]: 2026-01-23 10:16:58.974 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:16:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:59 compute-2 sudo[229124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:16:59 compute-2 sudo[229124]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:59 compute-2 sudo[229124]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:59 compute-2 sudo[229149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:16:59 compute-2 sudo[229149]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:16:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:16:59 compute-2 nova_compute[225701]: 2026-01-23 10:16:59.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:59 compute-2 nova_compute[225701]: 2026-01-23 10:16:59.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:16:59 compute-2 sudo[229149]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:16:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:59.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:00.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:00 compute-2 nova_compute[225701]: 2026-01-23 10:17:00.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:00 compute-2 nova_compute[225701]: 2026-01-23 10:17:00.815 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:00 compute-2 nova_compute[225701]: 2026-01-23 10:17:00.816 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:00 compute-2 nova_compute[225701]: 2026-01-23 10:17:00.816 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:00 compute-2 nova_compute[225701]: 2026-01-23 10:17:00.816 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:17:00 compute-2 nova_compute[225701]: 2026-01-23 10:17:00.817 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:01 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:01.134 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:17:01 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:01.135 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:17:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:01.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:02 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:02.138 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:02 compute-2 ceph-mon[75771]: pgmap v711: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 6.2 KiB/s rd, 12 KiB/s wr, 1 op/s
Jan 23 10:17:02 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1666916957' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:17:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2232532070' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:02 compute-2 nova_compute[225701]: 2026-01-23 10:17:02.601 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.784s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:02 compute-2 nova_compute[225701]: 2026-01-23 10:17:02.742 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:17:02 compute-2 nova_compute[225701]: 2026-01-23 10:17:02.743 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5194MB free_disk=59.942752838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:17:02 compute-2 nova_compute[225701]: 2026-01-23 10:17:02.743 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:02 compute-2 nova_compute[225701]: 2026-01-23 10:17:02.744 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:17:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:02.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:17:02 compute-2 nova_compute[225701]: 2026-01-23 10:17:02.822 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:17:02 compute-2 nova_compute[225701]: 2026-01-23 10:17:02.823 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:17:02 compute-2 nova_compute[225701]: 2026-01-23 10:17:02.843 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:03 compute-2 ceph-mon[75771]: pgmap v712: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 6.2 KiB/s rd, 12 KiB/s wr, 1 op/s
Jan 23 10:17:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1117145556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3419588943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:03 compute-2 ceph-mon[75771]: pgmap v713: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 6.5 KiB/s rd, 16 KiB/s wr, 1 op/s
Jan 23 10:17:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2232532070' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:03 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:03 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:03 compute-2 sudo[229252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:17:03 compute-2 sudo[229252]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:17:03 compute-2 sudo[229252]: pam_unix(sudo:session): session closed for user root
Jan 23 10:17:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:17:03 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2443746606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:03 compute-2 nova_compute[225701]: 2026-01-23 10:17:03.506 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.663s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:03 compute-2 nova_compute[225701]: 2026-01-23 10:17:03.513 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:17:03 compute-2 nova_compute[225701]: 2026-01-23 10:17:03.534 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:17:03 compute-2 nova_compute[225701]: 2026-01-23 10:17:03.537 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:17:03 compute-2 nova_compute[225701]: 2026-01-23 10:17:03.538 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:17:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:03.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:17:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:17:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2443746606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:17:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:17:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:17:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:17:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:04 compute-2 nova_compute[225701]: 2026-01-23 10:17:04.538 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:04 compute-2 nova_compute[225701]: 2026-01-23 10:17:04.563 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:04 compute-2 nova_compute[225701]: 2026-01-23 10:17:04.564 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:04.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:04 compute-2 nova_compute[225701]: 2026-01-23 10:17:04.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1679498423' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:05 compute-2 ceph-mon[75771]: pgmap v714: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 5.6 KiB/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:17:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:17:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/703034000' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:05 compute-2 nova_compute[225701]: 2026-01-23 10:17:05.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:17:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:05.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:17:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:17:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:06.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:17:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:07 compute-2 nova_compute[225701]: 2026-01-23 10:17:07.214 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:07 compute-2 nova_compute[225701]: 2026-01-23 10:17:07.214 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:07 compute-2 nova_compute[225701]: 2026-01-23 10:17:07.360 225706 DEBUG nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 10:17:07 compute-2 nova_compute[225701]: 2026-01-23 10:17:07.456 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:07 compute-2 nova_compute[225701]: 2026-01-23 10:17:07.457 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:07 compute-2 nova_compute[225701]: 2026-01-23 10:17:07.463 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 10:17:07 compute-2 nova_compute[225701]: 2026-01-23 10:17:07.464 225706 INFO nova.compute.claims [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Claim successful on node compute-2.ctlplane.example.com
Jan 23 10:17:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:07 compute-2 nova_compute[225701]: 2026-01-23 10:17:07.627 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:07 compute-2 ceph-mon[75771]: pgmap v715: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 5.6 KiB/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:17:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:07.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:17:08 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1232319574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.096 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.108 225706 DEBUG nova.compute.provider_tree [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.138 225706 DEBUG nova.scheduler.client.report [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.189 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.190 225706 DEBUG nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 10:17:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.267 225706 DEBUG nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.268 225706 DEBUG nova.network.neutron [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.337 225706 INFO nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.388 225706 DEBUG nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 10:17:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.560 225706 DEBUG nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.561 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.562 225706 INFO nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Creating image(s)
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.591 225706 DEBUG nova.storage.rbd_utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.622 225706 DEBUG nova.storage.rbd_utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.650 225706 DEBUG nova.storage.rbd_utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.653 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:08 compute-2 nova_compute[225701]: 2026-01-23 10:17:08.654 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:08.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1232319574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:09 compute-2 nova_compute[225701]: 2026-01-23 10:17:09.008 225706 DEBUG nova.virt.libvirt.imagebackend [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image locations are: [{'url': 'rbd://f3005f84-239a-55b6-a948-8f1fb592b920/images/271ec98e-d058-421b-bbfb-4b4a5954c90a/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://f3005f84-239a-55b6-a948-8f1fb592b920/images/271ec98e-d058-421b-bbfb-4b4a5954c90a/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Jan 23 10:17:09 compute-2 nova_compute[225701]: 2026-01-23 10:17:09.137 225706 WARNING oslo_policy.policy [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 23 10:17:09 compute-2 nova_compute[225701]: 2026-01-23 10:17:09.137 225706 WARNING oslo_policy.policy [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 23 10:17:09 compute-2 nova_compute[225701]: 2026-01-23 10:17:09.140 225706 DEBUG nova.policy [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:17:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:09 compute-2 sudo[229361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:17:09 compute-2 sudo[229361]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:17:09 compute-2 sudo[229361]: pam_unix(sudo:session): session closed for user root
Jan 23 10:17:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:09.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:10 compute-2 ceph-mon[75771]: pgmap v716: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:17:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.191 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.255 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.part --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.257 225706 DEBUG nova.virt.images [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] 271ec98e-d058-421b-bbfb-4b4a5954c90a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.258 225706 DEBUG nova.privsep.utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.259 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.part /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:10.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.776 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.part /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.converted" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.784 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.842 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.converted --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.843 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.869 225706 DEBUG nova.storage.rbd_utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:17:10 compute-2 nova_compute[225701]: 2026-01-23 10:17:10.872 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:11 compute-2 ceph-mon[75771]: pgmap v717: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:17:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:11 compute-2 nova_compute[225701]: 2026-01-23 10:17:11.635 225706 DEBUG nova.network.neutron [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Successfully created port: 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:17:11 compute-2 podman[229440]: 2026-01-23 10:17:11.643617986 +0000 UTC m=+0.059757960 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 10:17:11 compute-2 podman[229439]: 2026-01-23 10:17:11.662657058 +0000 UTC m=+0.082272157 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:17:11 compute-2 nova_compute[225701]: 2026-01-23 10:17:11.687 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.814s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:11 compute-2 nova_compute[225701]: 2026-01-23 10:17:11.753 225706 DEBUG nova.storage.rbd_utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 10:17:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:11.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8000ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:12 compute-2 nova_compute[225701]: 2026-01-23 10:17:12.262 225706 DEBUG nova.objects.instance [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid c07a4c22-2b21-4f01-9038-2a522a89b3b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:17:12 compute-2 nova_compute[225701]: 2026-01-23 10:17:12.292 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 10:17:12 compute-2 nova_compute[225701]: 2026-01-23 10:17:12.293 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Ensure instance console log exists: /var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 10:17:12 compute-2 nova_compute[225701]: 2026-01-23 10:17:12.293 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:12 compute-2 nova_compute[225701]: 2026-01-23 10:17:12.293 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:12 compute-2 nova_compute[225701]: 2026-01-23 10:17:12.294 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:17:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:12.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:17:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:13 compute-2 ceph-mon[75771]: pgmap v718: 353 pgs: 353 active+clean; 132 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 399 KiB/s wr, 21 op/s
Jan 23 10:17:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:17:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:13.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:17:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80019e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:14.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:15 compute-2 nova_compute[225701]: 2026-01-23 10:17:15.216 225706 DEBUG nova.network.neutron [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Successfully updated port: 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:17:15 compute-2 nova_compute[225701]: 2026-01-23 10:17:15.272 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-c07a4c22-2b21-4f01-9038-2a522a89b3b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:17:15 compute-2 nova_compute[225701]: 2026-01-23 10:17:15.273 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-c07a4c22-2b21-4f01-9038-2a522a89b3b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:17:15 compute-2 nova_compute[225701]: 2026-01-23 10:17:15.273 225706 DEBUG nova.network.neutron [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:17:15 compute-2 nova_compute[225701]: 2026-01-23 10:17:15.365 225706 DEBUG nova.compute.manager [req-a96878ab-5390-45e7-9d67-e13e83030dc8 req-2e50b125-aa5d-4942-a79c-48899e1639c3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Received event network-changed-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:17:15 compute-2 nova_compute[225701]: 2026-01-23 10:17:15.365 225706 DEBUG nova.compute.manager [req-a96878ab-5390-45e7-9d67-e13e83030dc8 req-2e50b125-aa5d-4942-a79c-48899e1639c3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Refreshing instance network info cache due to event network-changed-6d05af08-c0fc-42c9-a5aa-eea4c33255a9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:17:15 compute-2 nova_compute[225701]: 2026-01-23 10:17:15.365 225706 DEBUG oslo_concurrency.lockutils [req-a96878ab-5390-45e7-9d67-e13e83030dc8 req-2e50b125-aa5d-4942-a79c-48899e1639c3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-c07a4c22-2b21-4f01-9038-2a522a89b3b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:17:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:15.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:15 compute-2 ceph-mon[75771]: pgmap v719: 353 pgs: 353 active+clean; 132 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 396 KiB/s wr, 21 op/s
Jan 23 10:17:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:16 compute-2 nova_compute[225701]: 2026-01-23 10:17:16.214 225706 DEBUG nova.network.neutron [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 10:17:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:16.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:17 compute-2 ceph-mon[75771]: pgmap v720: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 23 10:17:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:17.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:18.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.277 225706 DEBUG nova.network.neutron [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Updating instance_info_cache with network_info: [{"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:17:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.692 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-c07a4c22-2b21-4f01-9038-2a522a89b3b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.692 225706 DEBUG nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Instance network_info: |[{"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.693 225706 DEBUG oslo_concurrency.lockutils [req-a96878ab-5390-45e7-9d67-e13e83030dc8 req-2e50b125-aa5d-4942-a79c-48899e1639c3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-c07a4c22-2b21-4f01-9038-2a522a89b3b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.693 225706 DEBUG nova.network.neutron [req-a96878ab-5390-45e7-9d67-e13e83030dc8 req-2e50b125-aa5d-4942-a79c-48899e1639c3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Refreshing network info cache for port 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.696 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Start _get_guest_xml network_info=[{"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.701 225706 WARNING nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.708 225706 DEBUG nova.virt.libvirt.host [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.708 225706 DEBUG nova.virt.libvirt.host [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.712 225706 DEBUG nova.virt.libvirt.host [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.713 225706 DEBUG nova.virt.libvirt.host [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.714 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.714 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.715 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.716 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.716 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.716 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.717 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 10:17:19 compute-2 ceph-mon[75771]: pgmap v721: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.718 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.718 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.719 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.719 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.720 225706 DEBUG nova.virt.hardware [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.729 225706 DEBUG nova.privsep.utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 10:17:19 compute-2 nova_compute[225701]: 2026-01-23 10:17:19.730 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:19.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:17:20 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2560479743' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.196 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.220 225706 DEBUG nova.storage.rbd_utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.223 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:17:20 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3266007736' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.655 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.657 225706 DEBUG nova.virt.libvirt.vif [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:17:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1495306657',display_name='tempest-TestNetworkBasicOps-server-1495306657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1495306657',id=2,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLxxDi55rJrmjd9tPx3gB/xixK1x2HRLnmG9Byb+stnwWJq6KflxlWAuyclQ+qi9LouKqZ2XGS+GdJbd7h/CxxTRs6viENTRpetaAiEPpt6tQjNJzUsbg7BLt/d0b2D9Pg==',key_name='tempest-TestNetworkBasicOps-1853917503',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-n0x2ufhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:17:08Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=c07a4c22-2b21-4f01-9038-2a522a89b3b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.657 225706 DEBUG nova.network.os_vif_util [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.658 225706 DEBUG nova.network.os_vif_util [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ea:43,bridge_name='br-int',has_traffic_filtering=True,id=6d05af08-c0fc-42c9-a5aa-eea4c33255a9,network=Network(ea2c9056-8899-490c-806c-edc7669d1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d05af08-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.661 225706 DEBUG nova.objects.instance [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid c07a4c22-2b21-4f01-9038-2a522a89b3b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.699 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] End _get_guest_xml xml=<domain type="kvm">
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <uuid>c07a4c22-2b21-4f01-9038-2a522a89b3b1</uuid>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <name>instance-00000002</name>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <memory>131072</memory>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <vcpu>1</vcpu>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <metadata>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <nova:name>tempest-TestNetworkBasicOps-server-1495306657</nova:name>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <nova:creationTime>2026-01-23 10:17:19</nova:creationTime>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <nova:flavor name="m1.nano">
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <nova:memory>128</nova:memory>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <nova:disk>1</nova:disk>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <nova:swap>0</nova:swap>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <nova:vcpus>1</nova:vcpus>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       </nova:flavor>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <nova:owner>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       </nova:owner>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <nova:ports>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <nova:port uuid="6d05af08-c0fc-42c9-a5aa-eea4c33255a9">
Jan 23 10:17:20 compute-2 nova_compute[225701]:           <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         </nova:port>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       </nova:ports>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     </nova:instance>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   </metadata>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <sysinfo type="smbios">
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <system>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <entry name="manufacturer">RDO</entry>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <entry name="product">OpenStack Compute</entry>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <entry name="serial">c07a4c22-2b21-4f01-9038-2a522a89b3b1</entry>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <entry name="uuid">c07a4c22-2b21-4f01-9038-2a522a89b3b1</entry>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <entry name="family">Virtual Machine</entry>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     </system>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   </sysinfo>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <os>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <boot dev="hd"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <smbios mode="sysinfo"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   </os>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <features>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <acpi/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <apic/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <vmcoreinfo/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   </features>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <clock offset="utc">
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <timer name="hpet" present="no"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   </clock>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <cpu mode="host-model" match="exact">
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   </cpu>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   <devices>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <disk type="network" device="disk">
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <driver type="raw" cache="none"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <source protocol="rbd" name="vms/c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk">
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       </source>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <auth username="openstack">
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       </auth>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <target dev="vda" bus="virtio"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <disk type="network" device="cdrom">
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <driver type="raw" cache="none"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <source protocol="rbd" name="vms/c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk.config">
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       </source>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <auth username="openstack">
Jan 23 10:17:20 compute-2 nova_compute[225701]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       </auth>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <target dev="sda" bus="sata"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <interface type="ethernet">
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <mac address="fa:16:3e:5f:ea:43"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <model type="virtio"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <mtu size="1442"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <target dev="tap6d05af08-c0"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     </interface>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <serial type="pty">
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <log file="/var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1/console.log" append="off"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     </serial>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <video>
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <model type="virtio"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     </video>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <input type="tablet" bus="usb"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <rng model="virtio">
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <backend model="random">/dev/urandom</backend>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     </rng>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <controller type="usb" index="0"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     <memballoon model="virtio">
Jan 23 10:17:20 compute-2 nova_compute[225701]:       <stats period="10"/>
Jan 23 10:17:20 compute-2 nova_compute[225701]:     </memballoon>
Jan 23 10:17:20 compute-2 nova_compute[225701]:   </devices>
Jan 23 10:17:20 compute-2 nova_compute[225701]: </domain>
Jan 23 10:17:20 compute-2 nova_compute[225701]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.701 225706 DEBUG nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Preparing to wait for external event network-vif-plugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.701 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.701 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.701 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.702 225706 DEBUG nova.virt.libvirt.vif [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:17:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1495306657',display_name='tempest-TestNetworkBasicOps-server-1495306657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1495306657',id=2,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLxxDi55rJrmjd9tPx3gB/xixK1x2HRLnmG9Byb+stnwWJq6KflxlWAuyclQ+qi9LouKqZ2XGS+GdJbd7h/CxxTRs6viENTRpetaAiEPpt6tQjNJzUsbg7BLt/d0b2D9Pg==',key_name='tempest-TestNetworkBasicOps-1853917503',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-n0x2ufhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:17:08Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=c07a4c22-2b21-4f01-9038-2a522a89b3b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.703 225706 DEBUG nova.network.os_vif_util [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.703 225706 DEBUG nova.network.os_vif_util [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ea:43,bridge_name='br-int',has_traffic_filtering=True,id=6d05af08-c0fc-42c9-a5aa-eea4c33255a9,network=Network(ea2c9056-8899-490c-806c-edc7669d1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d05af08-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.703 225706 DEBUG os_vif [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ea:43,bridge_name='br-int',has_traffic_filtering=True,id=6d05af08-c0fc-42c9-a5aa-eea4c33255a9,network=Network(ea2c9056-8899-490c-806c-edc7669d1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d05af08-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.741 225706 DEBUG ovsdbapp.backend.ovs_idl [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.741 225706 DEBUG ovsdbapp.backend.ovs_idl [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.743 225706 DEBUG ovsdbapp.backend.ovs_idl [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.743 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.744 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.744 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.745 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.747 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.749 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.760 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.761 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.761 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:17:20 compute-2 nova_compute[225701]: 2026-01-23 10:17:20.762 225706 INFO oslo.privsep.daemon [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp0zb0sw2r/privsep.sock']
Jan 23 10:17:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:20.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:17:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2560479743' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:17:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3266007736' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:17:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.391 225706 INFO oslo.privsep.daemon [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Spawned new privsep daemon via rootwrap
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.273 229634 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.277 229634 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.279 229634 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.279 229634 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229634
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.444 225706 DEBUG nova.network.neutron [req-a96878ab-5390-45e7-9d67-e13e83030dc8 req-2e50b125-aa5d-4942-a79c-48899e1639c3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Updated VIF entry in instance network info cache for port 6d05af08-c0fc-42c9-a5aa-eea4c33255a9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.445 225706 DEBUG nova.network.neutron [req-a96878ab-5390-45e7-9d67-e13e83030dc8 req-2e50b125-aa5d-4942-a79c-48899e1639c3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Updating instance_info_cache with network_info: [{"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:17:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.580 225706 DEBUG oslo_concurrency.lockutils [req-a96878ab-5390-45e7-9d67-e13e83030dc8 req-2e50b125-aa5d-4942-a79c-48899e1639c3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-c07a4c22-2b21-4f01-9038-2a522a89b3b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.722 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.722 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d05af08-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.723 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6d05af08-c0, col_values=(('external_ids', {'iface-id': '6d05af08-c0fc-42c9-a5aa-eea4c33255a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:ea:43', 'vm-uuid': 'c07a4c22-2b21-4f01-9038-2a522a89b3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.725 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:21 compute-2 NetworkManager[48964]: <info>  [1769163441.7270] manager: (tap6d05af08-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.728 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.732 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:21 compute-2 nova_compute[225701]: 2026-01-23 10:17:21.732 225706 INFO os_vif [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ea:43,bridge_name='br-int',has_traffic_filtering=True,id=6d05af08-c0fc-42c9-a5aa-eea4c33255a9,network=Network(ea2c9056-8899-490c-806c-edc7669d1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d05af08-c0')
Jan 23 10:17:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:17:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:21.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:17:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:22 compute-2 ceph-mon[75771]: pgmap v722: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 23 10:17:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:22 compute-2 nova_compute[225701]: 2026-01-23 10:17:22.562 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:17:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:22.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:17:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:23 compute-2 ceph-mon[75771]: pgmap v723: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 23 10:17:23 compute-2 sudo[229642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:17:23 compute-2 sudo[229642]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:17:23 compute-2 sudo[229642]: pam_unix(sudo:session): session closed for user root
Jan 23 10:17:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:23.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:24 compute-2 nova_compute[225701]: 2026-01-23 10:17:24.517 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:17:24 compute-2 nova_compute[225701]: 2026-01-23 10:17:24.517 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:17:24 compute-2 nova_compute[225701]: 2026-01-23 10:17:24.518 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:5f:ea:43, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:17:24 compute-2 nova_compute[225701]: 2026-01-23 10:17:24.518 225706 INFO nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Using config drive
Jan 23 10:17:24 compute-2 nova_compute[225701]: 2026-01-23 10:17:24.544 225706 DEBUG nova.storage.rbd_utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:17:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:17:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:24.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:17:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:25.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.347 225706 INFO nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Creating config drive at /var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1/disk.config
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.352 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl9c7_p5_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:26 compute-2 ceph-mon[75771]: pgmap v724: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 9.2 KiB/s rd, 1.4 MiB/s wr, 13 op/s
Jan 23 10:17:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.485 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl9c7_p5_" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.519 225706 DEBUG nova.storage.rbd_utils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.522 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1/disk.config c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.725 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.738 225706 DEBUG oslo_concurrency.processutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1/disk.config c07a4c22-2b21-4f01-9038-2a522a89b3b1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.216s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.739 225706 INFO nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Deleting local config drive /var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1/disk.config because it was imported into RBD.
Jan 23 10:17:26 compute-2 systemd[1]: Starting libvirt secret daemon...
Jan 23 10:17:26 compute-2 systemd[1]: Started libvirt secret daemon.
Jan 23 10:17:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:26.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:26 compute-2 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 23 10:17:26 compute-2 kernel: tap6d05af08-c0: entered promiscuous mode
Jan 23 10:17:26 compute-2 NetworkManager[48964]: <info>  [1769163446.8285] manager: (tap6d05af08-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 23 10:17:26 compute-2 ovn_controller[132789]: 2026-01-23T10:17:26Z|00027|binding|INFO|Claiming lport 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 for this chassis.
Jan 23 10:17:26 compute-2 ovn_controller[132789]: 2026-01-23T10:17:26Z|00028|binding|INFO|6d05af08-c0fc-42c9-a5aa-eea4c33255a9: Claiming fa:16:3e:5f:ea:43 10.100.0.22
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.829 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.834 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:26 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:26.844 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:ea:43 10.100.0.22'], port_security=['fa:16:3e:5f:ea:43 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'c07a4c22-2b21-4f01-9038-2a522a89b3b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea2c9056-8899-490c-806c-edc7669d1876', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fe078e1b-317f-4f89-baf9-dae03f7c432d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c711f457-cdcf-449c-8bb3-bfedc76a2abf, chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=6d05af08-c0fc-42c9-a5aa-eea4c33255a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:17:26 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:26.845 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 in datapath ea2c9056-8899-490c-806c-edc7669d1876 bound to our chassis
Jan 23 10:17:26 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:26.848 142606 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ea2c9056-8899-490c-806c-edc7669d1876
Jan 23 10:17:26 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:26.849 142606 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpvlep2ode/privsep.sock']
Jan 23 10:17:26 compute-2 systemd-udevd[229766]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:17:26 compute-2 systemd-machined[194368]: New machine qemu-1-instance-00000002.
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.885 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:26 compute-2 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Jan 23 10:17:26 compute-2 ovn_controller[132789]: 2026-01-23T10:17:26Z|00029|binding|INFO|Setting lport 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 ovn-installed in OVS
Jan 23 10:17:26 compute-2 ovn_controller[132789]: 2026-01-23T10:17:26Z|00030|binding|INFO|Setting lport 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 up in Southbound
Jan 23 10:17:26 compute-2 nova_compute[225701]: 2026-01-23 10:17:26.890 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:26 compute-2 NetworkManager[48964]: <info>  [1769163446.9100] device (tap6d05af08-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:17:26 compute-2 NetworkManager[48964]: <info>  [1769163446.9105] device (tap6d05af08-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:17:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.155 225706 DEBUG nova.compute.manager [req-dcf362ae-6f87-4903-b01b-57014a65b087 req-7edc982f-cbbf-4bf3-8bef-c47590eb1ac0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Received event network-vif-plugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.155 225706 DEBUG oslo_concurrency.lockutils [req-dcf362ae-6f87-4903-b01b-57014a65b087 req-7edc982f-cbbf-4bf3-8bef-c47590eb1ac0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.156 225706 DEBUG oslo_concurrency.lockutils [req-dcf362ae-6f87-4903-b01b-57014a65b087 req-7edc982f-cbbf-4bf3-8bef-c47590eb1ac0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.156 225706 DEBUG oslo_concurrency.lockutils [req-dcf362ae-6f87-4903-b01b-57014a65b087 req-7edc982f-cbbf-4bf3-8bef-c47590eb1ac0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.156 225706 DEBUG nova.compute.manager [req-dcf362ae-6f87-4903-b01b-57014a65b087 req-7edc982f-cbbf-4bf3-8bef-c47590eb1ac0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Processing event network-vif-plugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 10:17:27 compute-2 ceph-mon[75771]: pgmap v725: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 9.2 KiB/s rd, 1.4 MiB/s wr, 13 op/s
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.460 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163447.4600725, c07a4c22-2b21-4f01-9038-2a522a89b3b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.462 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] VM Started (Lifecycle Event)
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.464 225706 DEBUG nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.468 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.478 225706 INFO nova.virt.libvirt.driver [-] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Instance spawned successfully.
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.479 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 10:17:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.483 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.486 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.506 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.507 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.507 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.508 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.508 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.508 225706 DEBUG nova.virt.libvirt.driver [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.511 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.512 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163447.460298, c07a4c22-2b21-4f01-9038-2a522a89b3b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.512 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] VM Paused (Lifecycle Event)
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.536 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.540 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163447.4671593, c07a4c22-2b21-4f01-9038-2a522a89b3b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.541 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] VM Resumed (Lifecycle Event)
Jan 23 10:17:27 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:27.545 142606 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 10:17:27 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:27.546 142606 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpvlep2ode/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 10:17:27 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:27.419 229823 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:17:27 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:27.425 229823 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:17:27 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:27.427 229823 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 23 10:17:27 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:27.427 229823 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229823
Jan 23 10:17:27 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:27.549 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[15385bd7-fe7c-4d1c-9875-776a9c1c28b1]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.565 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.566 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.569 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.577 225706 INFO nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Took 19.02 seconds to spawn the instance on the hypervisor.
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.578 225706 DEBUG nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.590 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.679 225706 INFO nova.compute.manager [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Took 20.24 seconds to build instance.
Jan 23 10:17:27 compute-2 nova_compute[225701]: 2026-01-23 10:17:27.695 225706 DEBUG oslo_concurrency.lockutils [None req-f262554d-786d-4fa9-8e73-1d24d9728e13 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:27.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.166 229823 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.166 229823 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.167 229823 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:28.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.897 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[3ae7e659-f985-4a40-b55d-f4b7d9f11d5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.899 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapea2c9056-81 in ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.901 229823 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapea2c9056-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.901 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[16b48071-2ca3-49ff-baa8-4b8fa35d6ec2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.904 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[ea9c3453-0767-46af-8237-a90fe2752275]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.933 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[f4755c0f-16e6-4120-9385-96804308f645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.959 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[87326fc5-d25a-41a6-945c-722cb0d62ac5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:28 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:28.962 142606 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpyb8dzjtc/privsep.sock']
Jan 23 10:17:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:29 compute-2 nova_compute[225701]: 2026-01-23 10:17:29.360 225706 DEBUG nova.compute.manager [req-c189256d-0f23-4681-b53a-65ee88acbbeb req-df4a3abc-26f4-4206-957c-dd7cab831d89 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Received event network-vif-plugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:17:29 compute-2 nova_compute[225701]: 2026-01-23 10:17:29.361 225706 DEBUG oslo_concurrency.lockutils [req-c189256d-0f23-4681-b53a-65ee88acbbeb req-df4a3abc-26f4-4206-957c-dd7cab831d89 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:29 compute-2 nova_compute[225701]: 2026-01-23 10:17:29.361 225706 DEBUG oslo_concurrency.lockutils [req-c189256d-0f23-4681-b53a-65ee88acbbeb req-df4a3abc-26f4-4206-957c-dd7cab831d89 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:29 compute-2 nova_compute[225701]: 2026-01-23 10:17:29.361 225706 DEBUG oslo_concurrency.lockutils [req-c189256d-0f23-4681-b53a-65ee88acbbeb req-df4a3abc-26f4-4206-957c-dd7cab831d89 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:29 compute-2 nova_compute[225701]: 2026-01-23 10:17:29.362 225706 DEBUG nova.compute.manager [req-c189256d-0f23-4681-b53a-65ee88acbbeb req-df4a3abc-26f4-4206-957c-dd7cab831d89 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] No waiting events found dispatching network-vif-plugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:17:29 compute-2 nova_compute[225701]: 2026-01-23 10:17:29.362 225706 WARNING nova.compute.manager [req-c189256d-0f23-4681-b53a-65ee88acbbeb req-df4a3abc-26f4-4206-957c-dd7cab831d89 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Received unexpected event network-vif-plugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 for instance with vm_state active and task_state None.
Jan 23 10:17:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:29.774 142606 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 10:17:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:29.775 142606 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpyb8dzjtc/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 10:17:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:29.513 229840 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:17:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:29.518 229840 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:17:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:29.520 229840 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 23 10:17:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:29.520 229840 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229840
Jan 23 10:17:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:29.778 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c63aff-b225-42b9-9fab-3899ad8026ec]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:29 compute-2 ceph-mon[75771]: pgmap v726: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:17:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:17:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:29.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:17:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.258 229840 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.258 229840 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.259 229840 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:30.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.839 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[9bae9025-986c-49f1-8cef-2bd059819d8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:30 compute-2 NetworkManager[48964]: <info>  [1769163450.8556] manager: (tapea2c9056-80): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.854 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[f55c8de8-9a29-4544-81cf-453dccc9202e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.883 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc76e82-3b28-44ce-b2b8-ab5b3aba0020]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.888 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5cdbce-3a16-4cd4-ba2a-60d94f364ce9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:30 compute-2 systemd-udevd[229854]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:17:30 compute-2 NetworkManager[48964]: <info>  [1769163450.9100] device (tapea2c9056-80): carrier: link connected
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.917 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[f64a4d70-c9c5-4481-b620-c0869f3c353c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.935 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[b8631173-0968-44f8-8b97-890378ad098d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea2c9056-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:50:61:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460355, 'reachable_time': 23630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229872, 'error': None, 'target': 'ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.949 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d13779-e4b3-4c4e-8786-bd6f789a096a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe50:61da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 460355, 'tstamp': 460355}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229873, 'error': None, 'target': 'ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.965 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[4a9ecfca-23ef-48e1-9d95-cd10ba1ee9d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea2c9056-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:50:61:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460355, 'reachable_time': 23630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229874, 'error': None, 'target': 'ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:30.992 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[7a808403-2e9f-4f2e-bb0a-64ceaf4e157b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:31 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:31.051 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[1f751999-de50-4ac4-9b1e-d1b6d066e3c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:31.054 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea2c9056-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:31.054 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:31.055 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea2c9056-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:31 compute-2 nova_compute[225701]: 2026-01-23 10:17:31.057 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:31 compute-2 kernel: tapea2c9056-80: entered promiscuous mode
Jan 23 10:17:31 compute-2 NetworkManager[48964]: <info>  [1769163451.0576] manager: (tapea2c9056-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:31.063 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapea2c9056-80, col_values=(('external_ids', {'iface-id': '208bdd6d-ea94-4760-8733-b7a2f7e2e275'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:31 compute-2 nova_compute[225701]: 2026-01-23 10:17:31.065 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:31 compute-2 ovn_controller[132789]: 2026-01-23T10:17:31Z|00031|binding|INFO|Releasing lport 208bdd6d-ea94-4760-8733-b7a2f7e2e275 from this chassis (sb_readonly=0)
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:31.069 142606 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea2c9056-8899-490c-806c-edc7669d1876.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea2c9056-8899-490c-806c-edc7669d1876.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:31.070 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[5a1d8f0f-2bff-4300-81ac-5ee3243b5095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:31.072 142606 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: global
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     log         /dev/log local0 debug
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     log-tag     haproxy-metadata-proxy-ea2c9056-8899-490c-806c-edc7669d1876
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     user        root
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     group       root
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     maxconn     1024
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     pidfile     /var/lib/neutron/external/pids/ea2c9056-8899-490c-806c-edc7669d1876.pid.haproxy
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     daemon
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: defaults
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     log global
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     mode http
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     option httplog
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     option dontlognull
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     option http-server-close
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     option forwardfor
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     retries                 3
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     timeout http-request    30s
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     timeout connect         30s
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     timeout client          32s
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     timeout server          32s
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     timeout http-keep-alive 30s
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: listen listener
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     bind 169.254.169.254:80
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:     http-request add-header X-OVN-Network-ID ea2c9056-8899-490c-806c-edc7669d1876
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:17:31 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:31.075 142606 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876', 'env', 'PROCESS_TAG=haproxy-ea2c9056-8899-490c-806c-edc7669d1876', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ea2c9056-8899-490c-806c-edc7669d1876.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:17:31 compute-2 nova_compute[225701]: 2026-01-23 10:17:31.078 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:31 compute-2 podman[229907]: 2026-01-23 10:17:31.494222991 +0000 UTC m=+0.029554287 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:17:31 compute-2 ceph-mon[75771]: pgmap v727: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:17:31 compute-2 podman[229907]: 2026-01-23 10:17:31.621339344 +0000 UTC m=+0.156670590 container create 3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:17:31 compute-2 nova_compute[225701]: 2026-01-23 10:17:31.727 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:31 compute-2 systemd[1]: Started libpod-conmon-3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3.scope.
Jan 23 10:17:31 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:17:31 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf7888edb03887a6ed0797d837078ec13d5e4829a7c47eb6c5b9736fe7a2eb5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:17:31 compute-2 podman[229907]: 2026-01-23 10:17:31.904062691 +0000 UTC m=+0.439393957 container init 3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 10:17:31 compute-2 podman[229907]: 2026-01-23 10:17:31.910857875 +0000 UTC m=+0.446189121 container start 3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:17:31 compute-2 neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876[229922]: [NOTICE]   (229926) : New worker (229928) forked
Jan 23 10:17:31 compute-2 neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876[229922]: [NOTICE]   (229926) : Loading success.
Jan 23 10:17:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:17:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:31.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:17:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <info>  [1769163452.0957] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Jan 23 10:17:32 compute-2 nova_compute[225701]: 2026-01-23 10:17:32.095 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <info>  [1769163452.0973] device (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <warn>  [1769163452.0974] device (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <info>  [1769163452.0982] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <info>  [1769163452.0986] device (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <warn>  [1769163452.0986] device (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <info>  [1769163452.0998] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <info>  [1769163452.1005] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <info>  [1769163452.1010] device (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 10:17:32 compute-2 NetworkManager[48964]: <info>  [1769163452.1013] device (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 10:17:32 compute-2 nova_compute[225701]: 2026-01-23 10:17:32.168 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:32 compute-2 ovn_controller[132789]: 2026-01-23T10:17:32Z|00032|binding|INFO|Releasing lport 208bdd6d-ea94-4760-8733-b7a2f7e2e275 from this chassis (sb_readonly=0)
Jan 23 10:17:32 compute-2 nova_compute[225701]: 2026-01-23 10:17:32.177 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:32 compute-2 nova_compute[225701]: 2026-01-23 10:17:32.567 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:32.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:33 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:33 compute-2 ceph-mon[75771]: pgmap v728: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 21 KiB/s wr, 75 op/s
Jan 23 10:17:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:33.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:34.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:35 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:35 compute-2 ceph-mon[75771]: pgmap v729: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 21 KiB/s wr, 75 op/s
Jan 23 10:17:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:17:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:35.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:36 compute-2 nova_compute[225701]: 2026-01-23 10:17:36.730 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:17:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:36.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:17:36 compute-2 ceph-mon[75771]: pgmap v730: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 22 KiB/s wr, 75 op/s
Jan 23 10:17:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:37 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:37 compute-2 nova_compute[225701]: 2026-01-23 10:17:37.570 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:37.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:38.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:39 compute-2 ceph-mon[75771]: pgmap v731: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 22 KiB/s wr, 75 op/s
Jan 23 10:17:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:17:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:39.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:17:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:40.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:41 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:41 compute-2 ovn_controller[132789]: 2026-01-23T10:17:41Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:ea:43 10.100.0.22
Jan 23 10:17:41 compute-2 ovn_controller[132789]: 2026-01-23T10:17:41Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:ea:43 10.100.0.22
Jan 23 10:17:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:41 compute-2 ceph-mon[75771]: pgmap v732: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 22 KiB/s wr, 75 op/s
Jan 23 10:17:41 compute-2 nova_compute[225701]: 2026-01-23 10:17:41.731 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:41.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:42 compute-2 nova_compute[225701]: 2026-01-23 10:17:42.572 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:42 compute-2 podman[229954]: 2026-01-23 10:17:42.643818585 +0000 UTC m=+0.056272306 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 10:17:42 compute-2 podman[229953]: 2026-01-23 10:17:42.672530482 +0000 UTC m=+0.086028078 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 23 10:17:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:42.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:43 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:43 compute-2 sudo[230001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:17:43 compute-2 sudo[230001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:17:43 compute-2 sudo[230001]: pam_unix(sudo:session): session closed for user root
Jan 23 10:17:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:43 compute-2 ceph-mon[75771]: pgmap v733: 353 pgs: 353 active+clean; 197 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 132 op/s
Jan 23 10:17:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:43.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:44.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:45 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:45 compute-2 ceph-mon[75771]: pgmap v734: 353 pgs: 353 active+clean; 197 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 207 KiB/s rd, 2.1 MiB/s wr, 57 op/s
Jan 23 10:17:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:45.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:46 compute-2 nova_compute[225701]: 2026-01-23 10:17:46.732 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:46.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:47 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:47 compute-2 nova_compute[225701]: 2026-01-23 10:17:47.594 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:47 compute-2 ceph-mon[75771]: pgmap v735: 353 pgs: 353 active+clean; 200 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 257 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:17:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:17:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:47.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:17:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:48.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.845 225706 DEBUG oslo_concurrency.lockutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.846 225706 DEBUG oslo_concurrency.lockutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.846 225706 DEBUG oslo_concurrency.lockutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.846 225706 DEBUG oslo_concurrency.lockutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.846 225706 DEBUG oslo_concurrency.lockutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.847 225706 INFO nova.compute.manager [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Terminating instance
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.848 225706 DEBUG nova.compute.manager [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 10:17:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3688632625' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:17:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3688632625' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:17:48 compute-2 kernel: tap6d05af08-c0 (unregistering): left promiscuous mode
Jan 23 10:17:48 compute-2 NetworkManager[48964]: <info>  [1769163468.9045] device (tap6d05af08-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:17:48 compute-2 ovn_controller[132789]: 2026-01-23T10:17:48Z|00033|binding|INFO|Releasing lport 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 from this chassis (sb_readonly=0)
Jan 23 10:17:48 compute-2 ovn_controller[132789]: 2026-01-23T10:17:48Z|00034|binding|INFO|Setting lport 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 down in Southbound
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.907 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:48 compute-2 ovn_controller[132789]: 2026-01-23T10:17:48Z|00035|binding|INFO|Removing iface tap6d05af08-c0 ovn-installed in OVS
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.909 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:48 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:48.913 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:ea:43 10.100.0.22'], port_security=['fa:16:3e:5f:ea:43 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'c07a4c22-2b21-4f01-9038-2a522a89b3b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea2c9056-8899-490c-806c-edc7669d1876', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fe078e1b-317f-4f89-baf9-dae03f7c432d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c711f457-cdcf-449c-8bb3-bfedc76a2abf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=6d05af08-c0fc-42c9-a5aa-eea4c33255a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:17:48 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:48.915 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 6d05af08-c0fc-42c9-a5aa-eea4c33255a9 in datapath ea2c9056-8899-490c-806c-edc7669d1876 unbound from our chassis
Jan 23 10:17:48 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:48.916 142606 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea2c9056-8899-490c-806c-edc7669d1876, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:17:48 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:48.917 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[021813a6-89a2-472c-b393-12f44265f1ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:48 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:48.918 142606 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876 namespace which is not needed anymore
Jan 23 10:17:48 compute-2 nova_compute[225701]: 2026-01-23 10:17:48.925 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:48 compute-2 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 23 10:17:48 compute-2 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 13.151s CPU time.
Jan 23 10:17:48 compute-2 systemd-machined[194368]: Machine qemu-1-instance-00000002 terminated.
Jan 23 10:17:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:49 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.071 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.075 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.092 225706 INFO nova.virt.libvirt.driver [-] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Instance destroyed successfully.
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.092 225706 DEBUG nova.objects.instance [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid c07a4c22-2b21-4f01-9038-2a522a89b3b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.103 225706 DEBUG nova.compute.manager [req-ff558875-1b22-436a-bd4b-89564a4bfe3b req-89da9a78-24b3-4873-81ca-54f3b491422a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Received event network-vif-unplugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.104 225706 DEBUG oslo_concurrency.lockutils [req-ff558875-1b22-436a-bd4b-89564a4bfe3b req-89da9a78-24b3-4873-81ca-54f3b491422a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.104 225706 DEBUG oslo_concurrency.lockutils [req-ff558875-1b22-436a-bd4b-89564a4bfe3b req-89da9a78-24b3-4873-81ca-54f3b491422a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.105 225706 DEBUG oslo_concurrency.lockutils [req-ff558875-1b22-436a-bd4b-89564a4bfe3b req-89da9a78-24b3-4873-81ca-54f3b491422a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.105 225706 DEBUG nova.compute.manager [req-ff558875-1b22-436a-bd4b-89564a4bfe3b req-89da9a78-24b3-4873-81ca-54f3b491422a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] No waiting events found dispatching network-vif-unplugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.105 225706 DEBUG nova.compute.manager [req-ff558875-1b22-436a-bd4b-89564a4bfe3b req-89da9a78-24b3-4873-81ca-54f3b491422a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Received event network-vif-unplugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.108 225706 DEBUG nova.virt.libvirt.vif [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:17:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1495306657',display_name='tempest-TestNetworkBasicOps-server-1495306657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1495306657',id=2,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLxxDi55rJrmjd9tPx3gB/xixK1x2HRLnmG9Byb+stnwWJq6KflxlWAuyclQ+qi9LouKqZ2XGS+GdJbd7h/CxxTRs6viENTRpetaAiEPpt6tQjNJzUsbg7BLt/d0b2D9Pg==',key_name='tempest-TestNetworkBasicOps-1853917503',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:17:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-n0x2ufhy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:17:27Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=c07a4c22-2b21-4f01-9038-2a522a89b3b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.109 225706 DEBUG nova.network.os_vif_util [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "address": "fa:16:3e:5f:ea:43", "network": {"id": "ea2c9056-8899-490c-806c-edc7669d1876", "bridge": "br-int", "label": "tempest-network-smoke--139118822", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d05af08-c0", "ovs_interfaceid": "6d05af08-c0fc-42c9-a5aa-eea4c33255a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.110 225706 DEBUG nova.network.os_vif_util [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ea:43,bridge_name='br-int',has_traffic_filtering=True,id=6d05af08-c0fc-42c9-a5aa-eea4c33255a9,network=Network(ea2c9056-8899-490c-806c-edc7669d1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d05af08-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.110 225706 DEBUG os_vif [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ea:43,bridge_name='br-int',has_traffic_filtering=True,id=6d05af08-c0fc-42c9-a5aa-eea4c33255a9,network=Network(ea2c9056-8899-490c-806c-edc7669d1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d05af08-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:17:49 compute-2 neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876[229922]: [NOTICE]   (229926) : haproxy version is 2.8.14-c23fe91
Jan 23 10:17:49 compute-2 neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876[229922]: [NOTICE]   (229926) : path to executable is /usr/sbin/haproxy
Jan 23 10:17:49 compute-2 neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876[229922]: [WARNING]  (229926) : Exiting Master process...
Jan 23 10:17:49 compute-2 neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876[229922]: [ALERT]    (229926) : Current worker (229928) exited with code 143 (Terminated)
Jan 23 10:17:49 compute-2 neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876[229922]: [WARNING]  (229926) : All workers exited. Exiting... (0)
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.114 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.114 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d05af08-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.116 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.118 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:49 compute-2 systemd[1]: libpod-3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3.scope: Deactivated successfully.
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.121 225706 INFO os_vif [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:ea:43,bridge_name='br-int',has_traffic_filtering=True,id=6d05af08-c0fc-42c9-a5aa-eea4c33255a9,network=Network(ea2c9056-8899-490c-806c-edc7669d1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d05af08-c0')
Jan 23 10:17:49 compute-2 podman[230058]: 2026-01-23 10:17:49.123674234 +0000 UTC m=+0.101002630 container died 3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 10:17:49 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3-userdata-shm.mount: Deactivated successfully.
Jan 23 10:17:49 compute-2 systemd[1]: var-lib-containers-storage-overlay-bf7888edb03887a6ed0797d837078ec13d5e4829a7c47eb6c5b9736fe7a2eb5f-merged.mount: Deactivated successfully.
Jan 23 10:17:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:49 compute-2 podman[230058]: 2026-01-23 10:17:49.33828683 +0000 UTC m=+0.315615216 container cleanup 3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 10:17:49 compute-2 systemd[1]: libpod-conmon-3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3.scope: Deactivated successfully.
Jan 23 10:17:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:49 compute-2 podman[230113]: 2026-01-23 10:17:49.490133271 +0000 UTC m=+0.126552169 container remove 3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:17:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:49.497 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[e4acc5cc-a5d7-4e4e-b0f6-e1a7e90d37e3]: (4, ('Fri Jan 23 10:17:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876 (3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3)\n3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3\nFri Jan 23 10:17:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876 (3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3)\n3105a5232b365273d9c35f6e56a1fd2e8510d5a68b643b81eccb01faa11ebfe3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:49.499 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[3973abc1-6751-426b-86cc-d5f4d4ab0f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:49.500 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea2c9056-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.502 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:49 compute-2 kernel: tapea2c9056-80: left promiscuous mode
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.529 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:49.532 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[45797e6c-efd8-40b5-8309-033a3db1cb6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:49.550 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8a4902-372a-4744-a838-e9ac6d9a5f2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:49.551 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2648c5-e327-409f-a16b-0aca288228ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:49.565 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[52958df3-0fb3-40b8-a1bc-22ee9d5bf8cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460348, 'reachable_time': 28644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230129, 'error': None, 'target': 'ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:49 compute-2 systemd[1]: run-netns-ovnmeta\x2dea2c9056\x2d8899\x2d490c\x2d806c\x2dedc7669d1876.mount: Deactivated successfully.
Jan 23 10:17:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:49.578 142723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ea2c9056-8899-490c-806c-edc7669d1876 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:17:49 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:49.580 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[338e1ce3-1d7e-4259-8996-74a7181edf8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.760 225706 INFO nova.virt.libvirt.driver [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Deleting instance files /var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1_del
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.761 225706 INFO nova.virt.libvirt.driver [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Deletion of /var/lib/nova/instances/c07a4c22-2b21-4f01-9038-2a522a89b3b1_del complete
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.836 225706 DEBUG nova.virt.libvirt.host [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.837 225706 INFO nova.virt.libvirt.host [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] UEFI support detected
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.840 225706 INFO nova.compute.manager [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Took 0.99 seconds to destroy the instance on the hypervisor.
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.840 225706 DEBUG oslo.service.loopingcall [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.841 225706 DEBUG nova.compute.manager [-] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 10:17:49 compute-2 nova_compute[225701]: 2026-01-23 10:17:49.842 225706 DEBUG nova.network.neutron [-] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 10:17:49 compute-2 ceph-mon[75771]: pgmap v736: 353 pgs: 353 active+clean; 200 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 257 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:17:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:49.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:50 compute-2 nova_compute[225701]: 2026-01-23 10:17:50.576 225706 DEBUG nova.network.neutron [-] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:17:50 compute-2 nova_compute[225701]: 2026-01-23 10:17:50.735 225706 DEBUG nova.compute.manager [req-e68f54b3-0547-40a3-88a8-1527fca98f76 req-7a783ce5-dd5f-4e04-8523-1e2423537047 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Received event network-vif-deleted-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:17:50 compute-2 nova_compute[225701]: 2026-01-23 10:17:50.736 225706 INFO nova.compute.manager [req-e68f54b3-0547-40a3-88a8-1527fca98f76 req-7a783ce5-dd5f-4e04-8523-1e2423537047 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Neutron deleted interface 6d05af08-c0fc-42c9-a5aa-eea4c33255a9; detaching it from the instance and deleting it from the info cache
Jan 23 10:17:50 compute-2 nova_compute[225701]: 2026-01-23 10:17:50.736 225706 DEBUG nova.network.neutron [req-e68f54b3-0547-40a3-88a8-1527fca98f76 req-7a783ce5-dd5f-4e04-8523-1e2423537047 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:17:50 compute-2 nova_compute[225701]: 2026-01-23 10:17:50.749 225706 INFO nova.compute.manager [-] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Took 0.91 seconds to deallocate network for instance.
Jan 23 10:17:50 compute-2 nova_compute[225701]: 2026-01-23 10:17:50.756 225706 DEBUG nova.compute.manager [req-e68f54b3-0547-40a3-88a8-1527fca98f76 req-7a783ce5-dd5f-4e04-8523-1e2423537047 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Detach interface failed, port_id=6d05af08-c0fc-42c9-a5aa-eea4c33255a9, reason: Instance c07a4c22-2b21-4f01-9038-2a522a89b3b1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 10:17:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:50.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:17:50 compute-2 ceph-mon[75771]: pgmap v737: 353 pgs: 353 active+clean; 200 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 257 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:17:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:51 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.047 225706 DEBUG oslo_concurrency.lockutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.047 225706 DEBUG oslo_concurrency.lockutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.131 225706 DEBUG oslo_concurrency.processutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.191 225706 DEBUG nova.compute.manager [req-b90d83e1-c23c-40b7-af30-e01b68d4c234 req-5a73140a-990b-4e15-8d84-5461c10eca3c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Received event network-vif-plugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.193 225706 DEBUG oslo_concurrency.lockutils [req-b90d83e1-c23c-40b7-af30-e01b68d4c234 req-5a73140a-990b-4e15-8d84-5461c10eca3c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.193 225706 DEBUG oslo_concurrency.lockutils [req-b90d83e1-c23c-40b7-af30-e01b68d4c234 req-5a73140a-990b-4e15-8d84-5461c10eca3c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.194 225706 DEBUG oslo_concurrency.lockutils [req-b90d83e1-c23c-40b7-af30-e01b68d4c234 req-5a73140a-990b-4e15-8d84-5461c10eca3c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.195 225706 DEBUG nova.compute.manager [req-b90d83e1-c23c-40b7-af30-e01b68d4c234 req-5a73140a-990b-4e15-8d84-5461c10eca3c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] No waiting events found dispatching network-vif-plugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.195 225706 WARNING nova.compute.manager [req-b90d83e1-c23c-40b7-af30-e01b68d4c234 req-5a73140a-990b-4e15-8d84-5461c10eca3c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Received unexpected event network-vif-plugged-6d05af08-c0fc-42c9-a5aa-eea4c33255a9 for instance with vm_state deleted and task_state None.
Jan 23 10:17:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:51 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:17:51 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1982629231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.632 225706 DEBUG oslo_concurrency.processutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.639 225706 DEBUG nova.compute.provider_tree [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.676 225706 ERROR nova.scheduler.client.report [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [req-a1ce6567-c9a0-4b71-8be7-d877e1a5f796] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID db762d15-510c-4120-bfc4-afe76b90b657.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-a1ce6567-c9a0-4b71-8be7-d877e1a5f796"}]}
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.689 225706 DEBUG nova.scheduler.client.report [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.708 225706 DEBUG nova.scheduler.client.report [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.708 225706 DEBUG nova.compute.provider_tree [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.725 225706 DEBUG nova.scheduler.client.report [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.747 225706 DEBUG nova.scheduler.client.report [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:17:51 compute-2 nova_compute[225701]: 2026-01-23 10:17:51.776 225706 DEBUG oslo_concurrency.processutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:51 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1982629231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:17:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:51.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:17:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:17:52 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1712693474' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:52 compute-2 nova_compute[225701]: 2026-01-23 10:17:52.254 225706 DEBUG oslo_concurrency.processutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:52 compute-2 nova_compute[225701]: 2026-01-23 10:17:52.261 225706 DEBUG nova.compute.provider_tree [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:17:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:52 compute-2 nova_compute[225701]: 2026-01-23 10:17:52.313 225706 DEBUG nova.scheduler.client.report [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updated inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 23 10:17:52 compute-2 nova_compute[225701]: 2026-01-23 10:17:52.314 225706 DEBUG nova.compute.provider_tree [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating resource provider db762d15-510c-4120-bfc4-afe76b90b657 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 23 10:17:52 compute-2 nova_compute[225701]: 2026-01-23 10:17:52.314 225706 DEBUG nova.compute.provider_tree [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:17:52 compute-2 nova_compute[225701]: 2026-01-23 10:17:52.340 225706 DEBUG oslo_concurrency.lockutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:52 compute-2 nova_compute[225701]: 2026-01-23 10:17:52.364 225706 INFO nova.scheduler.client.report [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance c07a4c22-2b21-4f01-9038-2a522a89b3b1
Jan 23 10:17:52 compute-2 nova_compute[225701]: 2026-01-23 10:17:52.437 225706 DEBUG oslo_concurrency.lockutils [None req-c1382e16-c44f-43e5-8dbb-d514cf5527c0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "c07a4c22-2b21-4f01-9038-2a522a89b3b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:52 compute-2 nova_compute[225701]: 2026-01-23 10:17:52.597 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:17:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:52.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:17:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:53 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:53 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1712693474' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:53 compute-2 ceph-mon[75771]: pgmap v738: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 276 KiB/s rd, 2.1 MiB/s wr, 90 op/s
Jan 23 10:17:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:53.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:54 compute-2 nova_compute[225701]: 2026-01-23 10:17:54.118 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:17:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:54.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:17:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:55 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:55.482 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:55.483 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:17:55.483 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:55 compute-2 ceph-mon[75771]: pgmap v739: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 18 KiB/s wr, 33 op/s
Jan 23 10:17:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:55.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:56.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:57 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003130 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:57 compute-2 ceph-mon[75771]: pgmap v740: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 19 KiB/s wr, 33 op/s
Jan 23 10:17:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:57 compute-2 nova_compute[225701]: 2026-01-23 10:17:57.734 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:57 compute-2 nova_compute[225701]: 2026-01-23 10:17:57.766 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:57 compute-2 nova_compute[225701]: 2026-01-23 10:17:57.869 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:57.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:58 compute-2 nova_compute[225701]: 2026-01-23 10:17:58.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:58 compute-2 nova_compute[225701]: 2026-01-23 10:17:58.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:58 compute-2 nova_compute[225701]: 2026-01-23 10:17:58.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:17:58 compute-2 nova_compute[225701]: 2026-01-23 10:17:58.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:17:58 compute-2 nova_compute[225701]: 2026-01-23 10:17:58.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:17:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:58.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:17:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:17:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:59 compute-2 nova_compute[225701]: 2026-01-23 10:17:59.121 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:17:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:17:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:17:59 compute-2 ceph-mon[75771]: pgmap v741: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:17:59 compute-2 nova_compute[225701]: 2026-01-23 10:17:59.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:17:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:59.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003130 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2675935538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:00.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:01 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:01 compute-2 ceph-mon[75771]: pgmap v742: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:18:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2538524120' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:01 compute-2 nova_compute[225701]: 2026-01-23 10:18:01.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:01 compute-2 nova_compute[225701]: 2026-01-23 10:18:01.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:18:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:18:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:01.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:18:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:02 compute-2 nova_compute[225701]: 2026-01-23 10:18:02.769 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:02 compute-2 nova_compute[225701]: 2026-01-23 10:18:02.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:02 compute-2 nova_compute[225701]: 2026-01-23 10:18:02.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:02.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:02 compute-2 nova_compute[225701]: 2026-01-23 10:18:02.831 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:02 compute-2 nova_compute[225701]: 2026-01-23 10:18:02.831 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:02 compute-2 nova_compute[225701]: 2026-01-23 10:18:02.831 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:02 compute-2 nova_compute[225701]: 2026-01-23 10:18:02.832 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:18:02 compute-2 nova_compute[225701]: 2026-01-23 10:18:02.832 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:02 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1665070625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:03 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:18:03 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2045104684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:03 compute-2 nova_compute[225701]: 2026-01-23 10:18:03.278 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:03 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:18:03.419 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:18:03 compute-2 nova_compute[225701]: 2026-01-23 10:18:03.420 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:03 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:18:03.420 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:18:03 compute-2 nova_compute[225701]: 2026-01-23 10:18:03.456 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:18:03 compute-2 nova_compute[225701]: 2026-01-23 10:18:03.457 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4927MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:18:03 compute-2 nova_compute[225701]: 2026-01-23 10:18:03.458 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:03 compute-2 nova_compute[225701]: 2026-01-23 10:18:03.458 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:03 compute-2 nova_compute[225701]: 2026-01-23 10:18:03.514 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:18:03 compute-2 nova_compute[225701]: 2026-01-23 10:18:03.515 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:18:03 compute-2 sudo[230213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:18:03 compute-2 sudo[230213]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:03 compute-2 nova_compute[225701]: 2026-01-23 10:18:03.537 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:03 compute-2 sudo[230213]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:03 compute-2 ceph-mon[75771]: pgmap v743: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 15 KiB/s wr, 56 op/s
Jan 23 10:18:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2045104684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:03.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:18:04 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3410514855' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:04 compute-2 nova_compute[225701]: 2026-01-23 10:18:04.065 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:04 compute-2 nova_compute[225701]: 2026-01-23 10:18:04.071 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:18:04 compute-2 nova_compute[225701]: 2026-01-23 10:18:04.086 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:18:04 compute-2 nova_compute[225701]: 2026-01-23 10:18:04.090 225706 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163469.0894485, c07a4c22-2b21-4f01-9038-2a522a89b3b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:18:04 compute-2 nova_compute[225701]: 2026-01-23 10:18:04.091 225706 INFO nova.compute.manager [-] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] VM Stopped (Lifecycle Event)
Jan 23 10:18:04 compute-2 nova_compute[225701]: 2026-01-23 10:18:04.112 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:18:04 compute-2 nova_compute[225701]: 2026-01-23 10:18:04.113 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:04 compute-2 nova_compute[225701]: 2026-01-23 10:18:04.115 225706 DEBUG nova.compute.manager [None req-58b007ab-cafe-481a-a987-404c5524594f - - - - - -] [instance: c07a4c22-2b21-4f01-9038-2a522a89b3b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:18:04 compute-2 nova_compute[225701]: 2026-01-23 10:18:04.123 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:04.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3410514855' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:04 compute-2 ceph-mon[75771]: pgmap v744: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Jan 23 10:18:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:05 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:18:05 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4166887387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:18:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4166887387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:18:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:05.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:18:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:06.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4138145315' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:06 compute-2 ceph-mon[75771]: pgmap v745: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Jan 23 10:18:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:07 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:07 compute-2 nova_compute[225701]: 2026-01-23 10:18:07.114 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:07 compute-2 nova_compute[225701]: 2026-01-23 10:18:07.115 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:07 compute-2 nova_compute[225701]: 2026-01-23 10:18:07.115 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:07 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:18:07.423 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:18:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101807 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:18:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:07 compute-2 nova_compute[225701]: 2026-01-23 10:18:07.772 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:08.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:18:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:08.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:18:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:09 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:09 compute-2 nova_compute[225701]: 2026-01-23 10:18:09.126 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:09 compute-2 sudo[230266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:18:09 compute-2 sudo[230266]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:09 compute-2 sudo[230266]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:09 compute-2 sudo[230291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:18:09 compute-2 sudo[230291]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:09 compute-2 ceph-mon[75771]: pgmap v746: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:18:09 compute-2 sudo[230291]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:18:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:10.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:18:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:18:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:18:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:18:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:18:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:18:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:18:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:18:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:10.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:11 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.639670) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491639904, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1207, "num_deletes": 251, "total_data_size": 2842174, "memory_usage": 2880448, "flush_reason": "Manual Compaction"}
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491658437, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1850362, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24742, "largest_seqno": 25944, "table_properties": {"data_size": 1845137, "index_size": 2685, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11658, "raw_average_key_size": 19, "raw_value_size": 1834496, "raw_average_value_size": 3135, "num_data_blocks": 120, "num_entries": 585, "num_filter_entries": 585, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163397, "oldest_key_time": 1769163397, "file_creation_time": 1769163491, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 18785 microseconds, and 5585 cpu microseconds.
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.658539) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1850362 bytes OK
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.658571) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.660502) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.660525) EVENT_LOG_v1 {"time_micros": 1769163491660521, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.660549) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2836354, prev total WAL file size 2836354, number of live WAL files 2.
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.661445) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1806KB)], [48(12MB)]
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491661601, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 14993034, "oldest_snapshot_seqno": -1}
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5451 keys, 12824286 bytes, temperature: kUnknown
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491756048, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 12824286, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12787832, "index_size": 21752, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139960, "raw_average_key_size": 25, "raw_value_size": 12688758, "raw_average_value_size": 2327, "num_data_blocks": 884, "num_entries": 5451, "num_filter_entries": 5451, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163491, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.756304) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 12824286 bytes
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.758333) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.6 rd, 135.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 12.5 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(15.0) write-amplify(6.9) OK, records in: 5967, records dropped: 516 output_compression: NoCompression
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.758353) EVENT_LOG_v1 {"time_micros": 1769163491758344, "job": 28, "event": "compaction_finished", "compaction_time_micros": 94515, "compaction_time_cpu_micros": 28801, "output_level": 6, "num_output_files": 1, "total_output_size": 12824286, "num_input_records": 5967, "num_output_records": 5451, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491759352, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491761399, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.661281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.761428) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.761431) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.761433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.761434) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:18:11.761436) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:12.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:12 compute-2 ceph-mon[75771]: pgmap v747: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:18:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:12 compute-2 nova_compute[225701]: 2026-01-23 10:18:12.775 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:12.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:13 compute-2 ceph-mon[75771]: pgmap v748: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:18:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:13 compute-2 podman[230351]: 2026-01-23 10:18:13.661642721 +0000 UTC m=+0.076219640 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:18:13 compute-2 podman[230350]: 2026-01-23 10:18:13.694458376 +0000 UTC m=+0.114485857 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 10:18:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:18:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:18:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:14 compute-2 nova_compute[225701]: 2026-01-23 10:18:14.128 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:18:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:14.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:18:14 compute-2 ceph-mon[75771]: pgmap v749: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:18:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:15 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:18:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:16.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:18:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:16 compute-2 sudo[230400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:18:16 compute-2 sudo[230400]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:16 compute-2 sudo[230400]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:18:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:16.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:16 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:18:16 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:18:16 compute-2 ceph-mon[75771]: pgmap v750: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:18:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:17 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:17 compute-2 nova_compute[225701]: 2026-01-23 10:18:17.777 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:18.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:18.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:19 compute-2 nova_compute[225701]: 2026-01-23 10:18:19.130 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:18:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:18:19 compute-2 ceph-mon[75771]: pgmap v751: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:18:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:18:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:20.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:18:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:20.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:21 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:22.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:22 compute-2 ceph-mon[75771]: pgmap v752: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:18:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:22 compute-2 nova_compute[225701]: 2026-01-23 10:18:22.781 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:22.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:23 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:23 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:18:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:23 compute-2 sudo[230435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:18:23 compute-2 sudo[230435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:23 compute-2 sudo[230435]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:23 compute-2 ceph-mon[75771]: pgmap v753: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:18:23 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/453783286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:24.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:24 compute-2 nova_compute[225701]: 2026-01-23 10:18:24.133 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc003ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:24.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:25 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:25 compute-2 ceph-mon[75771]: pgmap v754: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:18:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:18:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:26.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:18:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:26.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:27 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:27 compute-2 ceph-mon[75771]: pgmap v755: 353 pgs: 353 active+clean; 88 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 23 10:18:27 compute-2 nova_compute[225701]: 2026-01-23 10:18:27.783 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:28.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3880301074' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:18:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1407883863' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:18:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:28.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:29 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:29 compute-2 nova_compute[225701]: 2026-01-23 10:18:29.137 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101829 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:18:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.005000118s ======
Jan 23 10:18:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:30.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000118s
Jan 23 10:18:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:30 compute-2 ceph-mon[75771]: pgmap v756: 353 pgs: 353 active+clean; 88 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:18:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:18:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:30.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:18:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:31 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:31 compute-2 ceph-mon[75771]: pgmap v757: 353 pgs: 353 active+clean; 88 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:18:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:18:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:32.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:18:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:32 compute-2 nova_compute[225701]: 2026-01-23 10:18:32.816 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:18:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:32.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:18:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:33 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:33 compute-2 ceph-mon[75771]: pgmap v758: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 42 op/s
Jan 23 10:18:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:18:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:34.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:18:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:34 compute-2 nova_compute[225701]: 2026-01-23 10:18:34.139 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:34.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:35 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:35 compute-2 ceph-mon[75771]: pgmap v759: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 39 op/s
Jan 23 10:18:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:18:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:36.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:18:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:36.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:18:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:37 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:37 compute-2 nova_compute[225701]: 2026-01-23 10:18:37.819 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:38.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:38 compute-2 ceph-mon[75771]: pgmap v760: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 108 op/s
Jan 23 10:18:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:18:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:38.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:18:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:39 compute-2 nova_compute[225701]: 2026-01-23 10:18:39.142 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:39 compute-2 ceph-mon[75771]: pgmap v761: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 83 op/s
Jan 23 10:18:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:40.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0040b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:40.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:41 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:41 compute-2 ceph-mon[75771]: pgmap v762: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 83 op/s
Jan 23 10:18:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:42.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:42 compute-2 nova_compute[225701]: 2026-01-23 10:18:42.822 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:42.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:43 compute-2 ceph-mon[75771]: pgmap v763: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 83 op/s
Jan 23 10:18:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:43 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:43 compute-2 ovn_controller[132789]: 2026-01-23T10:18:43Z|00036|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 23 10:18:43 compute-2 sudo[230482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:18:43 compute-2 sudo[230482]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:43 compute-2 sudo[230482]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:43 compute-2 podman[230506]: 2026-01-23 10:18:43.79760195 +0000 UTC m=+0.069064476 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 10:18:43 compute-2 podman[230507]: 2026-01-23 10:18:43.819976402 +0000 UTC m=+0.089523632 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 10:18:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:44.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:44 compute-2 nova_compute[225701]: 2026-01-23 10:18:44.144 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:18:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:44.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:18:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:45 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:45 compute-2 ceph-mon[75771]: pgmap v764: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 68 op/s
Jan 23 10:18:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0040f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:46.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:46.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:47 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101847 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:18:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:47 compute-2 nova_compute[225701]: 2026-01-23 10:18:47.867 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:47 compute-2 ceph-mon[75771]: pgmap v765: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 134 op/s
Jan 23 10:18:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:48.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:18:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:48.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:18:48 compute-2 ceph-mon[75771]: pgmap v766: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 388 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:18:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2862770348' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:18:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2862770348' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:18:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:49 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:49 compute-2 nova_compute[225701]: 2026-01-23 10:18:49.183 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:50.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:18:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:18:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:50.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:18:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:51 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:51 compute-2 ceph-mon[75771]: pgmap v767: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 388 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:18:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:52.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:52 compute-2 rsyslogd[1004]: imjournal: 5028 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 23 10:18:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:52 compute-2 nova_compute[225701]: 2026-01-23 10:18:52.869 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:18:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:52.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:18:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:53 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:53 compute-2 ceph-mon[75771]: pgmap v768: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 388 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:18:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004130 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:54.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:54 compute-2 nova_compute[225701]: 2026-01-23 10:18:54.185 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:54.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:55 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:18:55.484 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:18:55.484 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:18:55.484 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:55 compute-2 ceph-mon[75771]: pgmap v769: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 388 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:18:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:56.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004150 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:18:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:18:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:56.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:18:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:57 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:57 compute-2 ceph-mon[75771]: pgmap v770: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 389 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Jan 23 10:18:57 compute-2 nova_compute[225701]: 2026-01-23 10:18:57.869 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:18:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:58.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:18:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:58 compute-2 nova_compute[225701]: 2026-01-23 10:18:58.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:18:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:18:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:58.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:18:58 compute-2 ceph-mon[75771]: pgmap v771: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 12 KiB/s wr, 1 op/s
Jan 23 10:18:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:59 compute-2 nova_compute[225701]: 2026-01-23 10:18:59.189 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:18:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:59 compute-2 nova_compute[225701]: 2026-01-23 10:18:59.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:59 compute-2 nova_compute[225701]: 2026-01-23 10:18:59.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:18:59 compute-2 nova_compute[225701]: 2026-01-23 10:18:59.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:18:59 compute-2 nova_compute[225701]: 2026-01-23 10:18:59.813 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:18:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:18:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:19:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:00.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1911174770' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:01 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:01 compute-2 ceph-mon[75771]: pgmap v772: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 12 KiB/s wr, 1 op/s
Jan 23 10:19:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1756463740' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:01 compute-2 nova_compute[225701]: 2026-01-23 10:19:01.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:02.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:02 compute-2 nova_compute[225701]: 2026-01-23 10:19:02.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:02 compute-2 nova_compute[225701]: 2026-01-23 10:19:02.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:02 compute-2 nova_compute[225701]: 2026-01-23 10:19:02.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:19:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:02.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:02 compute-2 nova_compute[225701]: 2026-01-23 10:19:02.918 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:19:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:03 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:03 compute-2 ceph-mon[75771]: pgmap v773: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 15 KiB/s wr, 4 op/s
Jan 23 10:19:03 compute-2 nova_compute[225701]: 2026-01-23 10:19:03.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:03 compute-2 sudo[230574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:19:03 compute-2 sudo[230574]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:03 compute-2 sudo[230574]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:03 compute-2 nova_compute[225701]: 2026-01-23 10:19:03.827 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:03 compute-2 nova_compute[225701]: 2026-01-23 10:19:03.827 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:03 compute-2 nova_compute[225701]: 2026-01-23 10:19:03.827 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:03 compute-2 nova_compute[225701]: 2026-01-23 10:19:03.828 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:19:03 compute-2 nova_compute[225701]: 2026-01-23 10:19:03.828 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:04.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:04 compute-2 nova_compute[225701]: 2026-01-23 10:19:04.234 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:19:04 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2031067823' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:04 compute-2 nova_compute[225701]: 2026-01-23 10:19:04.336 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0041b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:04 compute-2 nova_compute[225701]: 2026-01-23 10:19:04.547 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:19:04 compute-2 nova_compute[225701]: 2026-01-23 10:19:04.549 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4919MB free_disk=59.942726135253906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:19:04 compute-2 nova_compute[225701]: 2026-01-23 10:19:04.549 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:04 compute-2 nova_compute[225701]: 2026-01-23 10:19:04.550 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:04 compute-2 nova_compute[225701]: 2026-01-23 10:19:04.620 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:19:04 compute-2 nova_compute[225701]: 2026-01-23 10:19:04.621 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:19:04 compute-2 nova_compute[225701]: 2026-01-23 10:19:04.646 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2031067823' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:04.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:19:05 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4045296728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:05 compute-2 nova_compute[225701]: 2026-01-23 10:19:05.082 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:05 compute-2 nova_compute[225701]: 2026-01-23 10:19:05.088 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:19:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:05 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:05 compute-2 nova_compute[225701]: 2026-01-23 10:19:05.122 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:19:05 compute-2 nova_compute[225701]: 2026-01-23 10:19:05.123 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:19:05 compute-2 nova_compute[225701]: 2026-01-23 10:19:05.124 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:05 compute-2 ceph-mon[75771]: pgmap v774: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 15 KiB/s wr, 4 op/s
Jan 23 10:19:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:19:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4045296728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:06.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1043744357' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:06.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:07 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0041d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:07 compute-2 nova_compute[225701]: 2026-01-23 10:19:07.119 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:07 compute-2 nova_compute[225701]: 2026-01-23 10:19:07.151 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:07 compute-2 nova_compute[225701]: 2026-01-23 10:19:07.152 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:07 compute-2 nova_compute[225701]: 2026-01-23 10:19:07.153 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:07 compute-2 ceph-mon[75771]: pgmap v775: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 15 KiB/s rd, 16 KiB/s wr, 5 op/s
Jan 23 10:19:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1377108216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:07 compute-2 nova_compute[225701]: 2026-01-23 10:19:07.957 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:08.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:08.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3477460864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:09 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:09 compute-2 nova_compute[225701]: 2026-01-23 10:19:09.301 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101909 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:19:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:09 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:19:09.846 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:19:09 compute-2 nova_compute[225701]: 2026-01-23 10:19:09.846 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:09 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:19:09.847 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:19:09 compute-2 ceph-mon[75771]: pgmap v776: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 3.9 KiB/s wr, 4 op/s
Jan 23 10:19:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:10.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:10.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:11 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:11 compute-2 ceph-mon[75771]: pgmap v777: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 3.9 KiB/s wr, 4 op/s
Jan 23 10:19:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:12.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004210 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:12 compute-2 nova_compute[225701]: 2026-01-23 10:19:12.959 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:13 compute-2 ceph-mon[75771]: pgmap v778: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 7.9 KiB/s wr, 5 op/s
Jan 23 10:19:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:19:13.850 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:14.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:14 compute-2 nova_compute[225701]: 2026-01-23 10:19:14.304 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:14 compute-2 podman[230656]: 2026-01-23 10:19:14.712874745 +0000 UTC m=+0.079217487 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 10:19:14 compute-2 podman[230655]: 2026-01-23 10:19:14.752222362 +0000 UTC m=+0.124137248 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 10:19:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:14.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:15 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:15 compute-2 ceph-mon[75771]: pgmap v779: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 5.1 KiB/s wr, 2 op/s
Jan 23 10:19:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:16.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:16 compute-2 sudo[230701]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:19:16 compute-2 sudo[230701]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:16 compute-2 sudo[230701]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:16 compute-2 sudo[230727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:19:16 compute-2 sudo[230727]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:16.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:16 compute-2 sudo[230727]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:19:17 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/613545648' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:17 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:17 compute-2 ceph-mon[75771]: pgmap v780: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 6.2 KiB/s wr, 30 op/s
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/613545648' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:19:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:19:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:18 compute-2 nova_compute[225701]: 2026-01-23 10:19:18.014 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:18.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:18.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:19 compute-2 nova_compute[225701]: 2026-01-23 10:19:19.307 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:19 compute-2 ceph-mon[75771]: pgmap v781: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Jan 23 10:19:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:20.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:20.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:19:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:21 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:21 compute-2 ceph-mon[75771]: pgmap v782: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Jan 23 10:19:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:22 compute-2 sudo[230791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:19:22 compute-2 sudo[230791]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:22 compute-2 sudo[230791]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:22.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:22.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:22 compute-2 ceph-mon[75771]: pgmap v783: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Jan 23 10:19:23 compute-2 nova_compute[225701]: 2026-01-23 10:19:23.016 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:23 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:23 compute-2 sudo[230817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:19:23 compute-2 sudo[230817]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:23 compute-2 sudo[230817]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:24.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:24 compute-2 nova_compute[225701]: 2026-01-23 10:19:24.311 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:24.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:25 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:25 compute-2 ceph-mon[75771]: pgmap v784: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:19:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:26.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:26 compute-2 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 10:19:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:26.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:27 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:27 compute-2 ceph-mon[75771]: pgmap v785: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:19:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:28 compute-2 nova_compute[225701]: 2026-01-23 10:19:28.018 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:28.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0042b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:28.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:29 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:29 compute-2 nova_compute[225701]: 2026-01-23 10:19:29.314 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:30.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:30 compute-2 ceph-mon[75771]: pgmap v786: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:30.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:31 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0042d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:31 compute-2 ceph-mon[75771]: pgmap v787: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:32.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:32.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:33 compute-2 nova_compute[225701]: 2026-01-23 10:19:33.019 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:33 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:33 compute-2 ceph-mon[75771]: pgmap v788: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:34 compute-2 nova_compute[225701]: 2026-01-23 10:19:34.317 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:34.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:34.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:35 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:35 compute-2 ceph-mon[75771]: pgmap v789: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:19:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004310 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:36.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:36.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:37 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:37 compute-2 ceph-mon[75771]: pgmap v790: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:19:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:38 compute-2 nova_compute[225701]: 2026-01-23 10:19:38.021 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004310 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:38.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:38.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:39 compute-2 nova_compute[225701]: 2026-01-23 10:19:39.319 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:40.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:40.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:41 compute-2 ceph-mon[75771]: pgmap v791: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:41 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004330 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:42 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3401891283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:42 compute-2 ceph-mon[75771]: pgmap v792: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:42.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:42.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:43 compute-2 nova_compute[225701]: 2026-01-23 10:19:43.059 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:43 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:43 compute-2 ceph-mon[75771]: pgmap v793: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:19:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:43 compute-2 sudo[230864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:19:43 compute-2 sudo[230864]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:43 compute-2 sudo[230864]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:44 compute-2 nova_compute[225701]: 2026-01-23 10:19:44.330 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:44.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:44.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:45 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:45 compute-2 podman[230892]: 2026-01-23 10:19:45.633438679 +0000 UTC m=+0.059250792 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:19:45 compute-2 podman[230891]: 2026-01-23 10:19:45.65941245 +0000 UTC m=+0.084003241 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 10:19:45 compute-2 ceph-mon[75771]: pgmap v794: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:19:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004390 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:46.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:46.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:47 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:48 compute-2 nova_compute[225701]: 2026-01-23 10:19:48.111 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:48.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:48 compute-2 ceph-mon[75771]: pgmap v795: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:19:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2913217029' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:19:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3310066042' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:19:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:48.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:49 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:49 compute-2 nova_compute[225701]: 2026-01-23 10:19:49.333 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:49 compute-2 ceph-mon[75771]: pgmap v796: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:19:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2936068880' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:19:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2936068880' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:19:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:50.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:19:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:50.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:51 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:51 compute-2 ceph-mon[75771]: pgmap v797: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:19:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:52.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:52.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:53 compute-2 nova_compute[225701]: 2026-01-23 10:19:53.115 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:53 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:53 compute-2 ceph-mon[75771]: pgmap v798: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 1.8 MiB/s wr, 79 op/s
Jan 23 10:19:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:54 compute-2 nova_compute[225701]: 2026-01-23 10:19:54.335 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:54.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:54.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:55 compute-2 ceph-mon[75771]: pgmap v799: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 1.8 MiB/s wr, 79 op/s
Jan 23 10:19:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:55 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:19:55.485 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:19:55.486 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:19:55.486 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:56.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:56.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:57 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004570 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004570 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:58 compute-2 nova_compute[225701]: 2026-01-23 10:19:58.118 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:58.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:58 compute-2 nova_compute[225701]: 2026-01-23 10:19:58.812 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:19:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:19:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:58.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:19:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:19:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:59 compute-2 nova_compute[225701]: 2026-01-23 10:19:59.368 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:00.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:00 compute-2 nova_compute[225701]: 2026-01-23 10:20:00.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:00 compute-2 nova_compute[225701]: 2026-01-23 10:20:00.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:20:00 compute-2 nova_compute[225701]: 2026-01-23 10:20:00.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:20:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:00 compute-2 nova_compute[225701]: 2026-01-23 10:20:00.799 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:20:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:00.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:01 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:02.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:02 compute-2 nova_compute[225701]: 2026-01-23 10:20:02.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:02.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:03 compute-2 nova_compute[225701]: 2026-01-23 10:20:03.121 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:03 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0045b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:03 compute-2 ceph-mds[83039]: mds.beacon.cephfs.compute-2.prgzmm missed beacon ack from the monitors
Jan 23 10:20:03 compute-2 nova_compute[225701]: 2026-01-23 10:20:03.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:03 compute-2 nova_compute[225701]: 2026-01-23 10:20:03.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:03 compute-2 nova_compute[225701]: 2026-01-23 10:20:03.816 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:20:03 compute-2 nova_compute[225701]: 2026-01-23 10:20:03.816 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:20:03 compute-2 nova_compute[225701]: 2026-01-23 10:20:03.816 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:20:03 compute-2 nova_compute[225701]: 2026-01-23 10:20:03.817 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:20:03 compute-2 nova_compute[225701]: 2026-01-23 10:20:03.817 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:20:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:04 compute-2 sudo[230970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:20:04 compute-2 sudo[230970]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:04 compute-2 sudo[230970]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:04 compute-2 nova_compute[225701]: 2026-01-23 10:20:04.371 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:04.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:04.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:05 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:06 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_commit, latency = 5.380156040s
Jan 23 10:20:06 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_sync, latency = 5.380156517s
Jan 23 10:20:06 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.380500793s, txc = 0x559226356c00
Jan 23 10:20:06 compute-2 ceph-mon[75771]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 10:20:06 compute-2 ceph-mon[75771]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 10:20:06 compute-2 ceph-mon[75771]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 10:20:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 10:20:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:06.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:06 compute-2 nova_compute[225701]: 2026-01-23 10:20:06.617 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.799s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:20:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:06 compute-2 nova_compute[225701]: 2026-01-23 10:20:06.793 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:20:06 compute-2 nova_compute[225701]: 2026-01-23 10:20:06.794 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4912MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:20:06 compute-2 nova_compute[225701]: 2026-01-23 10:20:06.795 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:20:06 compute-2 nova_compute[225701]: 2026-01-23 10:20:06.795 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:20:06 compute-2 nova_compute[225701]: 2026-01-23 10:20:06.874 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:20:06 compute-2 nova_compute[225701]: 2026-01-23 10:20:06.874 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:20:06 compute-2 nova_compute[225701]: 2026-01-23 10:20:06.892 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:20:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:06.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:07 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:20:07 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1702015841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:07 compute-2 nova_compute[225701]: 2026-01-23 10:20:07.339 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:20:07 compute-2 nova_compute[225701]: 2026-01-23 10:20:07.344 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:20:07 compute-2 nova_compute[225701]: 2026-01-23 10:20:07.361 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:20:07 compute-2 nova_compute[225701]: 2026-01-23 10:20:07.362 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:20:07 compute-2 nova_compute[225701]: 2026-01-23 10:20:07.362 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:20:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:08 compute-2 nova_compute[225701]: 2026-01-23 10:20:08.123 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:08 compute-2 nova_compute[225701]: 2026-01-23 10:20:08.363 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:08 compute-2 nova_compute[225701]: 2026-01-23 10:20:08.364 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:08 compute-2 nova_compute[225701]: 2026-01-23 10:20:08.364 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:08 compute-2 nova_compute[225701]: 2026-01-23 10:20:08.364 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:20:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:08.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:08 compute-2 nova_compute[225701]: 2026-01-23 10:20:08.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:08.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:08 compute-2 ceph-mon[75771]: pgmap v801: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Jan 23 10:20:08 compute-2 ceph-mon[75771]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:20:08 compute-2 ceph-mon[75771]: pgmap v802: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Jan 23 10:20:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1668936281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2233825662' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-2 ceph-mon[75771]: pgmap v803: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 81 op/s
Jan 23 10:20:08 compute-2 ceph-mon[75771]: pgmap v804: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 709 KiB/s rd, 28 op/s
Jan 23 10:20:08 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-2 ceph-mon[75771]: mon.compute-1 calling monitor election
Jan 23 10:20:08 compute-2 ceph-mon[75771]: mon.compute-0 calling monitor election
Jan 23 10:20:08 compute-2 ceph-mon[75771]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 10:20:08 compute-2 ceph-mon[75771]: monmap epoch 3
Jan 23 10:20:08 compute-2 ceph-mon[75771]: fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 10:20:08 compute-2 ceph-mon[75771]: last_changed 2026-01-23T09:50:47.540109+0000
Jan 23 10:20:08 compute-2 ceph-mon[75771]: created 2026-01-23T09:47:35.499222+0000
Jan 23 10:20:08 compute-2 ceph-mon[75771]: min_mon_release 19 (squid)
Jan 23 10:20:08 compute-2 ceph-mon[75771]: election_strategy: 1
Jan 23 10:20:08 compute-2 ceph-mon[75771]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Jan 23 10:20:08 compute-2 ceph-mon[75771]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Jan 23 10:20:08 compute-2 ceph-mon[75771]: 2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Jan 23 10:20:08 compute-2 ceph-mon[75771]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 2 up:standby
Jan 23 10:20:08 compute-2 ceph-mon[75771]: osdmap e146: 3 total, 3 up, 3 in
Jan 23 10:20:08 compute-2 ceph-mon[75771]: mgrmap e32: compute-0.nbdygh(active, since 25m), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 10:20:08 compute-2 ceph-mon[75771]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:20:08 compute-2 ceph-mon[75771]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:20:08 compute-2 ceph-mon[75771]:      osd.1 observed slow operation indications in BlueStore
Jan 23 10:20:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1281217101' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2381106667' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-2 ceph-mon[75771]: pgmap v805: 353 pgs: 353 active+clean; 92 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 712 KiB/s rd, 588 KiB/s wr, 37 op/s
Jan 23 10:20:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:09 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:09 compute-2 nova_compute[225701]: 2026-01-23 10:20:09.374 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1702015841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1764227055' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:10 compute-2 ceph-mon[75771]: pgmap v806: 353 pgs: 353 active+clean; 92 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 588 KiB/s wr, 15 op/s
Jan 23 10:20:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:10.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:10.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:11 compute-2 ceph-mon[75771]: pgmap v807: 353 pgs: 353 active+clean; 92 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 588 KiB/s wr, 15 op/s
Jan 23 10:20:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:11 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004610 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:12.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:12.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:13 compute-2 ceph-mon[75771]: Health check update: 2 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 23 10:20:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:13 compute-2 nova_compute[225701]: 2026-01-23 10:20:13.125 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:14 compute-2 nova_compute[225701]: 2026-01-23 10:20:14.377 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:14.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:14 compute-2 ceph-mon[75771]: pgmap v808: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:20:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:14.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:15 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:15 compute-2 ceph-mon[75771]: pgmap v809: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 299 KiB/s rd, 2.1 MiB/s wr, 55 op/s
Jan 23 10:20:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:16 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:16.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:16 compute-2 podman[231043]: 2026-01-23 10:20:16.665790278 +0000 UTC m=+0.074812035 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 23 10:20:16 compute-2 podman[231042]: 2026-01-23 10:20:16.687763665 +0000 UTC m=+0.111199594 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 10:20:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:16.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:17 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:17 compute-2 ceph-mon[75771]: pgmap v810: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 300 KiB/s rd, 2.1 MiB/s wr, 56 op/s
Jan 23 10:20:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:18 compute-2 nova_compute[225701]: 2026-01-23 10:20:18.126 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:18.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:18 compute-2 nova_compute[225701]: 2026-01-23 10:20:18.564 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:18 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:20:18.564 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:20:18 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:20:18.566 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:20:18 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:20:18.568 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:20:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:18.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:19 compute-2 nova_compute[225701]: 2026-01-23 10:20:19.571 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:19 compute-2 ceph-mon[75771]: pgmap v811: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 296 KiB/s rd, 1.6 MiB/s wr, 48 op/s
Jan 23 10:20:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:20.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:20.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:21 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:20:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:22 compute-2 ceph-mon[75771]: pgmap v812: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 296 KiB/s rd, 1.6 MiB/s wr, 48 op/s
Jan 23 10:20:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:22 compute-2 sudo[231090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:20:22 compute-2 sudo[231090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:22 compute-2 sudo[231090]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:22 compute-2 sudo[231115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:20:22 compute-2 sudo[231115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:20:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:22.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:20:22 compute-2 sudo[231115]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:22.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:23 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:23 compute-2 nova_compute[225701]: 2026-01-23 10:20:23.189 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:24 compute-2 ceph-mon[75771]: pgmap v813: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 296 KiB/s rd, 1.6 MiB/s wr, 48 op/s
Jan 23 10:20:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:24 compute-2 sudo[231178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:20:24 compute-2 sudo[231178]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:24 compute-2 sudo[231178]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:24.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:24 compute-2 nova_compute[225701]: 2026-01-23 10:20:24.575 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:24.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:25 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:26 compute-2 ceph-mon[75771]: pgmap v814: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 15 KiB/s wr, 0 op/s
Jan 23 10:20:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4205134929' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:26.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:26.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:27 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:28 compute-2 ceph-mon[75771]: pgmap v815: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 16 KiB/s wr, 1 op/s
Jan 23 10:20:28 compute-2 nova_compute[225701]: 2026-01-23 10:20:28.195 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:28.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:28.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:29 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:29 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:29 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:29 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:20:29 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:20:29 compute-2 ceph-mon[75771]: pgmap v816: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:20:29 compute-2 nova_compute[225701]: 2026-01-23 10:20:29.577 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:20:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:20:30 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:20:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:30.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:30.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:31 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:31 compute-2 ceph-mon[75771]: pgmap v817: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:20:31 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/733879018' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:20:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1524953951' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:20:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:32.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:32.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:33 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:33 compute-2 nova_compute[225701]: 2026-01-23 10:20:33.234 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:33 compute-2 ceph-mon[75771]: pgmap v818: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:20:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:34 compute-2 sudo[231213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:20:34 compute-2 sudo[231213]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:34 compute-2 sudo[231213]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:20:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:34.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:20:34 compute-2 nova_compute[225701]: 2026-01-23 10:20:34.580 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:34.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:35 compute-2 ceph-mon[75771]: pgmap v819: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:20:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:35 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:20:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:36.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:36.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:37 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:37 compute-2 ceph-mon[75771]: pgmap v820: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:20:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:38 compute-2 nova_compute[225701]: 2026-01-23 10:20:38.237 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:38.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:38.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:39 compute-2 ceph-mon[75771]: pgmap v821: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:20:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:39 compute-2 nova_compute[225701]: 2026-01-23 10:20:39.582 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:20:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 5073 writes, 27K keys, 5073 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s
                                           Cumulative WAL: 5073 writes, 5073 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1503 writes, 7262 keys, 1503 commit groups, 1.0 writes per commit group, ingest: 17.13 MB, 0.03 MB/s
                                           Interval WAL: 1504 writes, 1504 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     88.0      0.43              0.17        14    0.031       0      0       0.0       0.0
                                             L6      1/0   12.23 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   4.3     86.9     75.1      2.14              0.71        13    0.165     68K   6779       0.0       0.0
                                            Sum      1/0   12.23 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.3     72.5     77.2      2.57              0.88        27    0.095     68K   6779       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.3    102.5    101.7      0.71              0.24        10    0.071     29K   2602       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   0.0     86.9     75.1      2.14              0.71        13    0.165     68K   6779       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     88.5      0.43              0.17        13    0.033       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.037, interval 0.010
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.19 GB write, 0.11 MB/s write, 0.18 GB read, 0.10 MB/s read, 2.6 seconds
                                           Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 13.48 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000126 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(716,12.93 MB,4.25455%) FilterBlock(27,201.17 KB,0.064624%) IndexBlock(27,355.48 KB,0.114195%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:20:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:40.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:41 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:41 compute-2 ceph-mon[75771]: pgmap v822: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:20:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:42.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:42 compute-2 sshd-session[231242]: Connection reset by 198.235.24.211 port 61772 [preauth]
Jan 23 10:20:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:42.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:43 compute-2 ceph-mon[75771]: pgmap v823: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:20:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:43 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:43 compute-2 nova_compute[225701]: 2026-01-23 10:20:43.239 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:44 compute-2 sudo[231250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:20:44 compute-2 sudo[231250]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:44 compute-2 sudo[231250]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:44.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:44 compute-2 nova_compute[225701]: 2026-01-23 10:20:44.593 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:44.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:45 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:45 compute-2 ceph-mon[75771]: pgmap v824: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Jan 23 10:20:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:46.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:46.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:47 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:47 compute-2 podman[231280]: 2026-01-23 10:20:47.650935376 +0000 UTC m=+0.062769223 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:20:47 compute-2 podman[231279]: 2026-01-23 10:20:47.679808852 +0000 UTC m=+0.104317246 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 10:20:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:47 compute-2 ceph-mon[75771]: pgmap v825: 353 pgs: 353 active+clean; 192 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 120 op/s
Jan 23 10:20:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:48 compute-2 nova_compute[225701]: 2026-01-23 10:20:48.240 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:48.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:48.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2314219766' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:20:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2314219766' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:20:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:49 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:49 compute-2 nova_compute[225701]: 2026-01-23 10:20:49.595 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:50 compute-2 ceph-mon[75771]: pgmap v826: 353 pgs: 353 active+clean; 192 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 261 KiB/s rd, 2.0 MiB/s wr, 46 op/s
Jan 23 10:20:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:20:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:50.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:51.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:51 compute-2 ceph-mon[75771]: pgmap v827: 353 pgs: 353 active+clean; 192 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 261 KiB/s rd, 2.0 MiB/s wr, 46 op/s
Jan 23 10:20:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:51 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:51 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:52.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:53.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:53 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:53 compute-2 nova_compute[225701]: 2026-01-23 10:20:53.282 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:53 compute-2 ceph-mon[75771]: pgmap v828: 353 pgs: 353 active+clean; 200 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Jan 23 10:20:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:20:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:54.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:20:54 compute-2 nova_compute[225701]: 2026-01-23 10:20:54.629 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:55.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:55 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:20:55.487 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:20:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:20:55.488 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:20:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:20:55.488 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:20:55 compute-2 ceph-mon[75771]: pgmap v829: 353 pgs: 353 active+clean; 200 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:20:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:56.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:56 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:57.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:57 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:57 compute-2 ceph-mon[75771]: pgmap v830: 353 pgs: 353 active+clean; 121 MiB data, 341 MiB used, 60 GiB / 60 GiB avail; 341 KiB/s rd, 2.1 MiB/s wr, 85 op/s
Jan 23 10:20:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:58 compute-2 nova_compute[225701]: 2026-01-23 10:20:58.285 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:20:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:58.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:20:58 compute-2 nova_compute[225701]: 2026-01-23 10:20:58.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1700923100' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:20:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:59.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:59 compute-2 nova_compute[225701]: 2026-01-23 10:20:59.632 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:20:59 compute-2 ceph-mon[75771]: pgmap v831: 353 pgs: 353 active+clean; 121 MiB data, 341 MiB used, 60 GiB / 60 GiB avail; 79 KiB/s rd, 105 KiB/s wr, 39 op/s
Jan 23 10:21:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:00.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:21:00.662 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:21:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:21:00.663 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:21:00 compute-2 nova_compute[225701]: 2026-01-23 10:21:00.665 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:01.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:01 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102101 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:21:01 compute-2 ceph-mon[75771]: pgmap v832: 353 pgs: 353 active+clean; 121 MiB data, 341 MiB used, 60 GiB / 60 GiB avail; 79 KiB/s rd, 105 KiB/s wr, 39 op/s
Jan 23 10:21:01 compute-2 nova_compute[225701]: 2026-01-23 10:21:01.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:01 compute-2 nova_compute[225701]: 2026-01-23 10:21:01.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:21:01 compute-2 nova_compute[225701]: 2026-01-23 10:21:01.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:21:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:01 compute-2 nova_compute[225701]: 2026-01-23 10:21:01.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:21:01 compute-2 nova_compute[225701]: 2026-01-23 10:21:01.801 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:01 compute-2 nova_compute[225701]: 2026-01-23 10:21:01.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:21:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:02.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:02 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:21:02.665 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:03.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:03 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:03 compute-2 nova_compute[225701]: 2026-01-23 10:21:03.288 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:03 compute-2 nova_compute[225701]: 2026-01-23 10:21:03.797 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:03 compute-2 ceph-mon[75771]: pgmap v833: 353 pgs: 353 active+clean; 121 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 105 KiB/s wr, 48 op/s
Jan 23 10:21:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:04 compute-2 sudo[231340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:21:04 compute-2 sudo[231340]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:04 compute-2 sudo[231340]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:21:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:04.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:21:04 compute-2 nova_compute[225701]: 2026-01-23 10:21:04.660 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:21:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:05.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:21:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:05 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3874868156' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/215747545' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:05 compute-2 nova_compute[225701]: 2026-01-23 10:21:05.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:05 compute-2 nova_compute[225701]: 2026-01-23 10:21:05.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:05 compute-2 nova_compute[225701]: 2026-01-23 10:21:05.813 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:05 compute-2 nova_compute[225701]: 2026-01-23 10:21:05.813 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:05 compute-2 nova_compute[225701]: 2026-01-23 10:21:05.814 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:05 compute-2 nova_compute[225701]: 2026-01-23 10:21:05.814 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:21:05 compute-2 nova_compute[225701]: 2026-01-23 10:21:05.814 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:06 compute-2 ceph-mon[75771]: pgmap v834: 353 pgs: 353 active+clean; 121 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 12 KiB/s wr, 31 op/s
Jan 23 10:21:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2548080809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:21:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:21:06 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1928470914' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:06 compute-2 nova_compute[225701]: 2026-01-23 10:21:06.290 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:06 compute-2 nova_compute[225701]: 2026-01-23 10:21:06.455 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:21:06 compute-2 nova_compute[225701]: 2026-01-23 10:21:06.456 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4905MB free_disk=59.94269943237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:21:06 compute-2 nova_compute[225701]: 2026-01-23 10:21:06.456 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:06 compute-2 nova_compute[225701]: 2026-01-23 10:21:06.456 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:06.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:06 compute-2 nova_compute[225701]: 2026-01-23 10:21:06.603 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:21:06 compute-2 nova_compute[225701]: 2026-01-23 10:21:06.604 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:21:06 compute-2 nova_compute[225701]: 2026-01-23 10:21:06.675 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:07.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:21:07 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/731693582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:07 compute-2 nova_compute[225701]: 2026-01-23 10:21:07.084 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:07 compute-2 nova_compute[225701]: 2026-01-23 10:21:07.090 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:21:07 compute-2 nova_compute[225701]: 2026-01-23 10:21:07.103 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:21:07 compute-2 nova_compute[225701]: 2026-01-23 10:21:07.105 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:21:07 compute-2 nova_compute[225701]: 2026-01-23 10:21:07.105 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:07 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:08 compute-2 nova_compute[225701]: 2026-01-23 10:21:08.101 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:08 compute-2 nova_compute[225701]: 2026-01-23 10:21:08.119 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:08 compute-2 nova_compute[225701]: 2026-01-23 10:21:08.119 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:08 compute-2 nova_compute[225701]: 2026-01-23 10:21:08.119 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:21:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:08 compute-2 nova_compute[225701]: 2026-01-23 10:21:08.290 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:08.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:08 compute-2 nova_compute[225701]: 2026-01-23 10:21:08.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:08 compute-2 nova_compute[225701]: 2026-01-23 10:21:08.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:09.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:09 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1928470914' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:09 compute-2 ceph-mon[75771]: pgmap v835: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 14 KiB/s wr, 56 op/s
Jan 23 10:21:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/731693582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:09 compute-2 nova_compute[225701]: 2026-01-23 10:21:09.675 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:09 compute-2 nova_compute[225701]: 2026-01-23 10:21:09.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3840207814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3507747902' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:21:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:21:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:10.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:21:10 compute-2 nova_compute[225701]: 2026-01-23 10:21:10.806 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:10 compute-2 nova_compute[225701]: 2026-01-23 10:21:10.806 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:21:10 compute-2 nova_compute[225701]: 2026-01-23 10:21:10.923 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:21:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:11.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:11 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:11 compute-2 ceph-mon[75771]: pgmap v836: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 1.7 KiB/s wr, 34 op/s
Jan 23 10:21:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:12 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:12.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:12 compute-2 ceph-mon[75771]: pgmap v837: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 1.7 KiB/s wr, 34 op/s
Jan 23 10:21:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:13.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:21:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:21:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:13 compute-2 nova_compute[225701]: 2026-01-23 10:21:13.348 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:13 compute-2 ceph-mon[75771]: pgmap v838: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 2.3 KiB/s wr, 35 op/s
Jan 23 10:21:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:14.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:14 compute-2 nova_compute[225701]: 2026-01-23 10:21:14.729 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:15.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:15 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:15 compute-2 ceph-mon[75771]: pgmap v839: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.7 KiB/s wr, 26 op/s
Jan 23 10:21:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:21:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:16.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:17.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:17 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:18 compute-2 ceph-mon[75771]: pgmap v840: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Jan 23 10:21:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:18 compute-2 nova_compute[225701]: 2026-01-23 10:21:18.350 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:18.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:18 compute-2 podman[231427]: 2026-01-23 10:21:18.643433905 +0000 UTC m=+0.072705110 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:21:18 compute-2 podman[231426]: 2026-01-23 10:21:18.644225966 +0000 UTC m=+0.075841572 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 10:21:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:19.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:19 compute-2 ceph-mon[75771]: pgmap v841: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:21:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:19 compute-2 nova_compute[225701]: 2026-01-23 10:21:19.732 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:21:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:20.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:21.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:21 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:21 compute-2 ceph-mon[75771]: pgmap v842: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:21:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102121 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:21:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:21:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:22.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:21:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:23.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:23 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:23 compute-2 nova_compute[225701]: 2026-01-23 10:21:23.351 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:23 compute-2 ceph-mon[75771]: pgmap v843: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:21:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:24 compute-2 sudo[231474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:21:24 compute-2 sudo[231474]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:24 compute-2 sudo[231474]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:21:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:24.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:21:24 compute-2 nova_compute[225701]: 2026-01-23 10:21:24.734 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:25.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:25 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:26 compute-2 ceph-mon[75771]: pgmap v844: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:21:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:26.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:27.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:27 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:27 compute-2 ceph-mon[75771]: pgmap v845: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:21:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:28 compute-2 nova_compute[225701]: 2026-01-23 10:21:28.375 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:28.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:29.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:29 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80017c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:29 compute-2 nova_compute[225701]: 2026-01-23 10:21:29.783 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:29 compute-2 ceph-mon[75771]: pgmap v846: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:21:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:30.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:31.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:31 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:31 compute-2 ceph-mon[75771]: pgmap v847: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:21:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001940 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:32.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:33.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:33 compute-2 nova_compute[225701]: 2026-01-23 10:21:33.377 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:33 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:34 compute-2 ceph-mon[75771]: pgmap v848: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:21:34 compute-2 sudo[231509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:21:34 compute-2 sudo[231509]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:34 compute-2 sudo[231509]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:34 compute-2 sudo[231535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 10:21:34 compute-2 sudo[231535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001940 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:21:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:34.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:21:34 compute-2 nova_compute[225701]: 2026-01-23 10:21:34.785 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:34 compute-2 sudo[231535]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:35.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:35 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:35 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3733854137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:36.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:37.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:37 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001ae0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:37 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:37 compute-2 ceph-mon[75771]: pgmap v849: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:21:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:21:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:38 compute-2 sudo[231583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:21:38 compute-2 sudo[231583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:38 compute-2 sudo[231583]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:38 compute-2 sudo[231608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:21:38 compute-2 sudo[231608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:38 compute-2 nova_compute[225701]: 2026-01-23 10:21:38.382 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:38.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:38 compute-2 sudo[231608]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:39.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:39 compute-2 ceph-mon[75771]: pgmap v850: 353 pgs: 353 active+clean; 62 MiB data, 257 MiB used, 60 GiB / 60 GiB avail; 8.4 KiB/s rd, 873 KiB/s wr, 14 op/s
Jan 23 10:21:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:39 compute-2 nova_compute[225701]: 2026-01-23 10:21:39.836 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80035d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:40 compute-2 ceph-mon[75771]: pgmap v851: 353 pgs: 353 active+clean; 84 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 8.5 KiB/s rd, 1.5 MiB/s wr, 16 op/s
Jan 23 10:21:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:21:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:21:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:21:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:21:40 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:21:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:40.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:41.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:41 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:41 compute-2 ceph-mon[75771]: pgmap v852: 353 pgs: 353 active+clean; 84 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 8.5 KiB/s rd, 1.5 MiB/s wr, 16 op/s
Jan 23 10:21:41 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1204880216' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:21:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:42 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1226134693' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:21:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80035d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:42.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:43.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:43 compute-2 nova_compute[225701]: 2026-01-23 10:21:43.419 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:43 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:43 compute-2 ceph-mon[75771]: pgmap v853: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:21:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:44 compute-2 sudo[231673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:21:44 compute-2 sudo[231673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:44 compute-2 sudo[231673]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:44 compute-2 sudo[231700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:21:44 compute-2 sudo[231700]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:44 compute-2 sudo[231700]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:44.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:44 compute-2 nova_compute[225701]: 2026-01-23 10:21:44.839 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:45.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:45 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:46.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:46 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 23 10:21:46 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:46.869627) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:21:46 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 23 10:21:46 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706869866, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2404, "num_deletes": 251, "total_data_size": 6597834, "memory_usage": 6707424, "flush_reason": "Manual Compaction"}
Jan 23 10:21:46 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 23 10:21:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:47.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:47 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708075384, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4220070, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25949, "largest_seqno": 28348, "table_properties": {"data_size": 4210435, "index_size": 6065, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20450, "raw_average_key_size": 20, "raw_value_size": 4190923, "raw_average_value_size": 4207, "num_data_blocks": 262, "num_entries": 996, "num_filter_entries": 996, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163491, "oldest_key_time": 1769163491, "file_creation_time": 1769163706, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:21:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 1366208 microseconds, and 10178 cpu microseconds.
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:21:48 compute-2 ceph-mon[75771]: pgmap v854: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.075559) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4220070 bytes OK
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.235949) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.241027) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.241075) EVENT_LOG_v1 {"time_micros": 1769163708241064, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.241114) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6587235, prev total WAL file size 6589174, number of live WAL files 2.
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.242960) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(4121KB)], [51(12MB)]
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708243148, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17044356, "oldest_snapshot_seqno": -1}
Jan 23 10:21:48 compute-2 nova_compute[225701]: 2026-01-23 10:21:48.475 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5924 keys, 14851976 bytes, temperature: kUnknown
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708537791, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 14851976, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14811093, "index_size": 24965, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14853, "raw_key_size": 150592, "raw_average_key_size": 25, "raw_value_size": 14702654, "raw_average_value_size": 2481, "num_data_blocks": 1017, "num_entries": 5924, "num_filter_entries": 5924, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163708, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:21:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c2d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:21:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:48.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.538143) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 14851976 bytes
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.781419) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 57.8 rd, 50.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.2 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(7.6) write-amplify(3.5) OK, records in: 6447, records dropped: 523 output_compression: NoCompression
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.781460) EVENT_LOG_v1 {"time_micros": 1769163708781445, "job": 30, "event": "compaction_finished", "compaction_time_micros": 294774, "compaction_time_cpu_micros": 37573, "output_level": 6, "num_output_files": 1, "total_output_size": 14851976, "num_input_records": 6447, "num_output_records": 5924, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708782406, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708784519, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.242801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784577) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784583) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:49.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:49 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:49 compute-2 ceph-mon[75771]: pgmap v855: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 984 KiB/s rd, 1.8 MiB/s wr, 68 op/s
Jan 23 10:21:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1255014004' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:21:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1255014004' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:21:49 compute-2 podman[231730]: 2026-01-23 10:21:49.641586862 +0000 UTC m=+0.056916051 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:21:49 compute-2 podman[231729]: 2026-01-23 10:21:49.743501886 +0000 UTC m=+0.165190279 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:21:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:49 compute-2 nova_compute[225701]: 2026-01-23 10:21:49.841 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:50.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:50 compute-2 nova_compute[225701]: 2026-01-23 10:21:50.780 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:51.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:51 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c2f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:51 compute-2 ceph-mon[75771]: pgmap v856: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 954 KiB/s wr, 86 op/s
Jan 23 10:21:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:21:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:52 compute-2 kernel: ganesha.nfsd[231499]: segfault at 50 ip 00007f2f772a832e sp 00007f2efe7fb210 error 4 in libntirpc.so.5.8[7f2f7728d000+2c000] likely on CPU 3 (core 0, socket 3)
Jan 23 10:21:52 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:21:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy ignored for local
Jan 23 10:21:52 compute-2 systemd[1]: Started Process Core Dump (PID 231774/UID 0).
Jan 23 10:21:52 compute-2 ceph-mon[75771]: pgmap v857: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 269 KiB/s wr, 85 op/s
Jan 23 10:21:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:21:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:52.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:21:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:53.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:53 compute-2 ceph-mon[75771]: pgmap v858: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 269 KiB/s wr, 85 op/s
Jan 23 10:21:53 compute-2 systemd-coredump[231775]: Process 228967 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 75:
                                                    #0  0x00007f2f772a832e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:21:53 compute-2 nova_compute[225701]: 2026-01-23 10:21:53.476 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:53 compute-2 systemd[1]: systemd-coredump@11-231774-0.service: Deactivated successfully.
Jan 23 10:21:53 compute-2 systemd[1]: systemd-coredump@11-231774-0.service: Consumed 1.174s CPU time.
Jan 23 10:21:53 compute-2 podman[231782]: 2026-01-23 10:21:53.544260032 +0000 UTC m=+0.023946659 container died fa785a85e35a7804c787f20020accc24473951046161ad46c7682dbaa03899c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:21:53 compute-2 systemd[1]: var-lib-containers-storage-overlay-d753937e541bf38247f02c0eae4c66ab31e8c5f3996cd75a5da7d39e87934a76-merged.mount: Deactivated successfully.
Jan 23 10:21:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:54 compute-2 podman[231782]: 2026-01-23 10:21:54.096959953 +0000 UTC m=+0.576646590 container remove fa785a85e35a7804c787f20020accc24473951046161ad46c7682dbaa03899c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:21:54 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:21:54 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:21:54 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.950s CPU time.
Jan 23 10:21:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:21:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:54.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:21:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:54 compute-2 nova_compute[225701]: 2026-01-23 10:21:54.842 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:55.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:21:55.488 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:21:55.488 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:21:55.489 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:56 compute-2 ceph-mon[75771]: pgmap v859: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Jan 23 10:21:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:56.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:57.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:57 compute-2 ceph-mon[75771]: pgmap v860: 353 pgs: 353 active+clean; 97 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 743 KiB/s wr, 85 op/s
Jan 23 10:21:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102158 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:21:58 compute-2 nova_compute[225701]: 2026-01-23 10:21:58.478 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:58.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:58 compute-2 nova_compute[225701]: 2026-01-23 10:21:58.802 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:21:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:59.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:21:59 compute-2 nova_compute[225701]: 2026-01-23 10:21:59.884 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:59 compute-2 ceph-mon[75771]: pgmap v861: 353 pgs: 353 active+clean; 108 MiB data, 294 MiB used, 60 GiB / 60 GiB avail; 1.1 MiB/s rd, 1.8 MiB/s wr, 64 op/s
Jan 23 10:22:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:00.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:01 compute-2 ceph-mon[75771]: pgmap v862: 353 pgs: 353 active+clean; 108 MiB data, 294 MiB used, 60 GiB / 60 GiB avail; 91 KiB/s rd, 1.8 MiB/s wr, 30 op/s
Jan 23 10:22:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:01.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:22:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 7795 writes, 32K keys, 7795 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 7795 writes, 1759 syncs, 4.43 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1988 writes, 7632 keys, 1988 commit groups, 1.0 writes per commit group, ingest: 8.26 MB, 0.01 MB/s
                                           Interval WAL: 1988 writes, 772 syncs, 2.58 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:22:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:02.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:03.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:03 compute-2 nova_compute[225701]: 2026-01-23 10:22:03.479 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:03 compute-2 ceph-mon[75771]: pgmap v863: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Jan 23 10:22:03 compute-2 nova_compute[225701]: 2026-01-23 10:22:03.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:03 compute-2 nova_compute[225701]: 2026-01-23 10:22:03.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:22:03 compute-2 nova_compute[225701]: 2026-01-23 10:22:03.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:22:03 compute-2 nova_compute[225701]: 2026-01-23 10:22:03.814 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:22:03 compute-2 nova_compute[225701]: 2026-01-23 10:22:03.814 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:04 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 12.
Jan 23 10:22:04 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:22:04 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.950s CPU time.
Jan 23 10:22:04 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:22:04 compute-2 podman[231885]: 2026-01-23 10:22:04.49453168 +0000 UTC m=+0.037889860 container create f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 23 10:22:04 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:04 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:04 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:04 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:04 compute-2 podman[231885]: 2026-01-23 10:22:04.556786209 +0000 UTC m=+0.100144399 container init f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:22:04 compute-2 podman[231885]: 2026-01-23 10:22:04.56263667 +0000 UTC m=+0.105994830 container start f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:22:04 compute-2 bash[231885]: f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39
Jan 23 10:22:04 compute-2 podman[231885]: 2026-01-23 10:22:04.478030074 +0000 UTC m=+0.021388254 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:22:04 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:22:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:04.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:04 compute-2 sudo[231913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:22:04 compute-2 sudo[231913]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:04 compute-2 sudo[231913]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:22:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1253421413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:04 compute-2 nova_compute[225701]: 2026-01-23 10:22:04.886 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:05.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:05 compute-2 ceph-mon[75771]: pgmap v864: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Jan 23 10:22:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1615457187' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:22:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:06.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:06 compute-2 nova_compute[225701]: 2026-01-23 10:22:06.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:06 compute-2 nova_compute[225701]: 2026-01-23 10:22:06.812 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:06 compute-2 nova_compute[225701]: 2026-01-23 10:22:06.813 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:06 compute-2 nova_compute[225701]: 2026-01-23 10:22:06.813 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:06 compute-2 nova_compute[225701]: 2026-01-23 10:22:06.813 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:22:06 compute-2 nova_compute[225701]: 2026-01-23 10:22:06.814 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:07.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:22:07 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2351850048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:07 compute-2 nova_compute[225701]: 2026-01-23 10:22:07.280 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:07 compute-2 nova_compute[225701]: 2026-01-23 10:22:07.437 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:22:07 compute-2 nova_compute[225701]: 2026-01-23 10:22:07.438 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4844MB free_disk=59.942752838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:22:07 compute-2 nova_compute[225701]: 2026-01-23 10:22:07.438 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:07 compute-2 nova_compute[225701]: 2026-01-23 10:22:07.438 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:07 compute-2 ceph-mon[75771]: pgmap v865: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 270 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 23 10:22:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2351850048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/748902986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:07 compute-2 nova_compute[225701]: 2026-01-23 10:22:07.994 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:22:07 compute-2 nova_compute[225701]: 2026-01-23 10:22:07.994 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:22:08 compute-2 nova_compute[225701]: 2026-01-23 10:22:08.021 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:22:08 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1160861138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:08 compute-2 nova_compute[225701]: 2026-01-23 10:22:08.454 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:08 compute-2 nova_compute[225701]: 2026-01-23 10:22:08.462 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:22:08 compute-2 nova_compute[225701]: 2026-01-23 10:22:08.481 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:08 compute-2 nova_compute[225701]: 2026-01-23 10:22:08.515 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:22:08 compute-2 nova_compute[225701]: 2026-01-23 10:22:08.518 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:22:08 compute-2 nova_compute[225701]: 2026-01-23 10:22:08.518 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:08.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1160861138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1277093325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:09.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:09 compute-2 nova_compute[225701]: 2026-01-23 10:22:09.519 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:09 compute-2 nova_compute[225701]: 2026-01-23 10:22:09.520 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:09 compute-2 nova_compute[225701]: 2026-01-23 10:22:09.520 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:09 compute-2 nova_compute[225701]: 2026-01-23 10:22:09.520 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:09 compute-2 nova_compute[225701]: 2026-01-23 10:22:09.520 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:22:09 compute-2 nova_compute[225701]: 2026-01-23 10:22:09.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:09 compute-2 nova_compute[225701]: 2026-01-23 10:22:09.933 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:09 compute-2 ceph-mon[75771]: pgmap v866: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 249 KiB/s rd, 1.4 MiB/s wr, 49 op/s
Jan 23 10:22:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:10.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:10 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:22:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:10 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:22:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:11 compute-2 ceph-mon[75771]: pgmap v867: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 179 KiB/s rd, 339 KiB/s wr, 30 op/s
Jan 23 10:22:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:11.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:12.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:12 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:13.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:13 compute-2 nova_compute[225701]: 2026-01-23 10:22:13.515 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:13 compute-2 ceph-mon[75771]: pgmap v868: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 180 KiB/s rd, 342 KiB/s wr, 32 op/s
Jan 23 10:22:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:14.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:14 compute-2 nova_compute[225701]: 2026-01-23 10:22:14.936 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:15.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:15 compute-2 ceph-mon[75771]: pgmap v869: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 15 KiB/s wr, 3 op/s
Jan 23 10:22:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:16.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:22:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:22:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:17.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:17 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:17 compute-2 ceph-mon[75771]: pgmap v870: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 16 KiB/s wr, 4 op/s
Jan 23 10:22:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:18 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:18 compute-2 nova_compute[225701]: 2026-01-23 10:22:18.517 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:18 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:18.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:19.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:19 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:19 compute-2 ceph-mon[75771]: pgmap v871: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 3.6 KiB/s wr, 3 op/s
Jan 23 10:22:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:19 compute-2 nova_compute[225701]: 2026-01-23 10:22:19.982 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102220 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:22:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:20 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde8000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:20 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:20 compute-2 podman[232046]: 2026-01-23 10:22:20.621934199 +0000 UTC m=+0.043429663 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:22:20 compute-2 podman[232045]: 2026-01-23 10:22:20.652125279 +0000 UTC m=+0.075435789 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:22:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:20.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:22:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:21.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:21 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:21 compute-2 ceph-mon[75771]: pgmap v872: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 3.2 KiB/s wr, 3 op/s
Jan 23 10:22:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:22 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:22 compute-2 nova_compute[225701]: 2026-01-23 10:22:22.618 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:22 compute-2 nova_compute[225701]: 2026-01-23 10:22:22.618 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:22 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:22 compute-2 nova_compute[225701]: 2026-01-23 10:22:22.646 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 10:22:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:22.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:22 compute-2 nova_compute[225701]: 2026-01-23 10:22:22.737 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:22 compute-2 nova_compute[225701]: 2026-01-23 10:22:22.737 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:22 compute-2 nova_compute[225701]: 2026-01-23 10:22:22.743 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 10:22:22 compute-2 nova_compute[225701]: 2026-01-23 10:22:22.743 225706 INFO nova.compute.claims [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Claim successful on node compute-2.ctlplane.example.com
Jan 23 10:22:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:22 compute-2 nova_compute[225701]: 2026-01-23 10:22:22.887 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:23.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:22:23 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1943420009' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.361 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.366 225706 DEBUG nova.compute.provider_tree [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.381 225706 DEBUG nova.scheduler.client.report [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.400 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.401 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.454 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.454 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:22:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:23 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.475 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.505 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.520 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.624 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.626 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.626 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Creating image(s)
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.657 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.686 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.709 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.712 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.731 225706 DEBUG nova.policy [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.769 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.770 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.772 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.772 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.802 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:22:23 compute-2 nova_compute[225701]: 2026-01-23 10:22:23.804 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:23 compute-2 ceph-mon[75771]: pgmap v873: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 6.6 KiB/s wr, 3 op/s
Jan 23 10:22:23 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1943420009' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:24 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:24 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:24.264 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.264 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:24 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:24.266 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.457 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.652s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.528 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.567 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Successfully created port: 06aeb511-67a6-4547-b061-9c4760285e3b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:22:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:24 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.633 225706 DEBUG nova.objects.instance [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid b8ea49c6-5f62-47b0-92cc-7399bfc98528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.652 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.652 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Ensure instance console log exists: /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.653 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.653 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.654 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:24.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:24 compute-2 sudo[232278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:22:24 compute-2 sudo[232278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:24 compute-2 sudo[232278]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:24 compute-2 nova_compute[225701]: 2026-01-23 10:22:24.984 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:25.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:25 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:25 compute-2 nova_compute[225701]: 2026-01-23 10:22:25.498 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Successfully updated port: 06aeb511-67a6-4547-b061-9c4760285e3b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:22:25 compute-2 nova_compute[225701]: 2026-01-23 10:22:25.515 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:22:25 compute-2 nova_compute[225701]: 2026-01-23 10:22:25.516 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:22:25 compute-2 nova_compute[225701]: 2026-01-23 10:22:25.516 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:22:25 compute-2 nova_compute[225701]: 2026-01-23 10:22:25.616 225706 DEBUG nova.compute.manager [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-changed-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:22:25 compute-2 nova_compute[225701]: 2026-01-23 10:22:25.617 225706 DEBUG nova.compute.manager [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Refreshing instance network info cache due to event network-changed-06aeb511-67a6-4547-b061-9c4760285e3b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:22:25 compute-2 nova_compute[225701]: 2026-01-23 10:22:25.618 225706 DEBUG oslo_concurrency.lockutils [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:22:25 compute-2 nova_compute[225701]: 2026-01-23 10:22:25.699 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 10:22:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:25 compute-2 ceph-mon[75771]: pgmap v874: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 3.7 KiB/s wr, 1 op/s
Jan 23 10:22:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:26 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.558 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Updating instance_info_cache with network_info: [{"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:22:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:26 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:26.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.734 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.734 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance network_info: |[{"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.735 225706 DEBUG oslo_concurrency.lockutils [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.735 225706 DEBUG nova.network.neutron [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Refreshing network info cache for port 06aeb511-67a6-4547-b061-9c4760285e3b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.737 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start _get_guest_xml network_info=[{"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.741 225706 WARNING nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.745 225706 DEBUG nova.virt.libvirt.host [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.745 225706 DEBUG nova.virt.libvirt.host [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.749 225706 DEBUG nova.virt.libvirt.host [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.749 225706 DEBUG nova.virt.libvirt.host [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.750 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.750 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.751 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.751 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.751 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.752 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.752 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.752 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.753 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.753 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.753 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.754 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 10:22:26 compute-2 nova_compute[225701]: 2026-01-23 10:22:26.758 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:27 compute-2 ceph-mon[75771]: pgmap v875: 353 pgs: 353 active+clean; 144 MiB data, 309 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 246 KiB/s wr, 27 op/s
Jan 23 10:22:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:27.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:22:27 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/815013215' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.332 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.573s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.368 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.375 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:27 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:22:27 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2638325462' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.840 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
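Nova shells out to `ceph mon dump --format=json` (twice here, once per RBD image check) to discover the monitor addresses that later appear as `<host>` entries in the guest disk XML. A minimal parsing sketch follows; the field names (`mons`, `addr`) and the `ip:port/nonce` address form match common Ceph output but are assumptions here, and the sample document is fabricated for illustration.

```python
import json

# Hedged sketch: pull (host, port) pairs out of the JSON emitted by
# `ceph mon dump --format=json`. Field names are assumed, not taken
# from this log.
def monitor_hosts(mon_dump_json):
    doc = json.loads(mon_dump_json)
    hosts = []
    for mon in doc.get("mons", []):
        # addr typically looks like "192.168.122.100:6789/0"
        hostport = mon.get("addr", "").split("/", 1)[0]
        if hostport:
            host, _, port = hostport.rpartition(":")
            hosts.append((host, int(port)))
    return hosts

# Illustrative sample, not real cluster output.
sample = json.dumps({"mons": [
    {"name": "a", "addr": "192.168.122.100:6789/0"},
    {"name": "b", "addr": "192.168.122.102:6789/0"},
]})
```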
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.842 225706 DEBUG nova.virt.libvirt.vif [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:22:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1878761076',display_name='tempest-TestNetworkBasicOps-server-1878761076',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1878761076',id=7,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnVJwsrFV8aI2nrJoEZl6VyyMwYmX81xfzmKsfGpDRm0DGXIQaGmDmPINRbdeF1kx8Y5VA3JSgU3fPoWzBbPsDeXm0p5hq8BrMWr1cPqMrGzO08egHCDlwB5XDUgBL1OA==',key_name='tempest-TestNetworkBasicOps-1503023412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-qvac9ktg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:22:23Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=b8ea49c6-5f62-47b0-92cc-7399bfc98528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.842 225706 DEBUG nova.network.os_vif_util [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.843 225706 DEBUG nova.network.os_vif_util [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.844 225706 DEBUG nova.objects.instance [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid b8ea49c6-5f62-47b0-92cc-7399bfc98528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:22:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.967 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] End _get_guest_xml xml=<domain type="kvm">
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <uuid>b8ea49c6-5f62-47b0-92cc-7399bfc98528</uuid>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <name>instance-00000007</name>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <memory>131072</memory>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <vcpu>1</vcpu>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <metadata>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <nova:name>tempest-TestNetworkBasicOps-server-1878761076</nova:name>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <nova:creationTime>2026-01-23 10:22:26</nova:creationTime>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <nova:flavor name="m1.nano">
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <nova:memory>128</nova:memory>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <nova:disk>1</nova:disk>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <nova:swap>0</nova:swap>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <nova:vcpus>1</nova:vcpus>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       </nova:flavor>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <nova:owner>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       </nova:owner>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <nova:ports>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <nova:port uuid="06aeb511-67a6-4547-b061-9c4760285e3b">
Jan 23 10:22:27 compute-2 nova_compute[225701]:           <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         </nova:port>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       </nova:ports>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     </nova:instance>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   </metadata>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <sysinfo type="smbios">
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <system>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <entry name="manufacturer">RDO</entry>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <entry name="product">OpenStack Compute</entry>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <entry name="serial">b8ea49c6-5f62-47b0-92cc-7399bfc98528</entry>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <entry name="uuid">b8ea49c6-5f62-47b0-92cc-7399bfc98528</entry>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <entry name="family">Virtual Machine</entry>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     </system>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   </sysinfo>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <os>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <boot dev="hd"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <smbios mode="sysinfo"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   </os>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <features>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <acpi/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <apic/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <vmcoreinfo/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   </features>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <clock offset="utc">
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <timer name="hpet" present="no"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   </clock>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <cpu mode="host-model" match="exact">
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   </cpu>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   <devices>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <disk type="network" device="disk">
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <driver type="raw" cache="none"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <source protocol="rbd" name="vms/b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk">
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       </source>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <auth username="openstack">
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       </auth>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <target dev="vda" bus="virtio"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <disk type="network" device="cdrom">
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <driver type="raw" cache="none"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <source protocol="rbd" name="vms/b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config">
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       </source>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <auth username="openstack">
Jan 23 10:22:27 compute-2 nova_compute[225701]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       </auth>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <target dev="sda" bus="sata"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <interface type="ethernet">
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <mac address="fa:16:3e:39:69:9c"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <model type="virtio"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <mtu size="1442"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <target dev="tap06aeb511-67"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     </interface>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <serial type="pty">
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <log file="/var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/console.log" append="off"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     </serial>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <video>
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <model type="virtio"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     </video>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <input type="tablet" bus="usb"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <rng model="virtio">
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <backend model="random">/dev/urandom</backend>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     </rng>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <controller type="usb" index="0"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     <memballoon model="virtio">
Jan 23 10:22:27 compute-2 nova_compute[225701]:       <stats period="10"/>
Jan 23 10:22:27 compute-2 nova_compute[225701]:     </memballoon>
Jan 23 10:22:27 compute-2 nova_compute[225701]:   </devices>
Jan 23 10:22:27 compute-2 nova_compute[225701]: </domain>
Jan 23 10:22:27 compute-2 nova_compute[225701]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
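The domain XML dumped above can be inspected programmatically. A hedged sketch with the stdlib XML parser, extracting the RBD image name and monitor endpoints from one `<disk>` element; the snippet operates on a trimmed copy of the XML from the log, not the full domain.

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the network <disk> element from the generated domain XML.
disk_xml = """
<disk type="network" device="disk">
  <driver type="raw" cache="none"/>
  <source protocol="rbd" name="vms/b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk">
    <host name="192.168.122.100" port="6789"/>
    <host name="192.168.122.102" port="6789"/>
    <host name="192.168.122.101" port="6789"/>
  </source>
  <target dev="vda" bus="virtio"/>
</disk>
"""

def rbd_source(disk_element_xml):
    """Return (image_name, [(host, port), ...]) for a network-backed disk."""
    disk = ET.fromstring(disk_element_xml)
    source = disk.find("source")
    hosts = [(h.get("name"), int(h.get("port"))) for h in source.findall("host")]
    return source.get("name"), hosts
```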
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.968 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Preparing to wait for external event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.969 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.969 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.969 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.970 225706 DEBUG nova.virt.libvirt.vif [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:22:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1878761076',display_name='tempest-TestNetworkBasicOps-server-1878761076',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1878761076',id=7,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnVJwsrFV8aI2nrJoEZl6VyyMwYmX81xfzmKsfGpDRm0DGXIQaGmDmPINRbdeF1kx8Y5VA3JSgU3fPoWzBbPsDeXm0p5hq8BrMWr1cPqMrGzO08egHCDlwB5XDUgBL1OA==',key_name='tempest-TestNetworkBasicOps-1503023412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-qvac9ktg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:22:23Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=b8ea49c6-5f62-47b0-92cc-7399bfc98528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.970 225706 DEBUG nova.network.os_vif_util [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.971 225706 DEBUG nova.network.os_vif_util [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.971 225706 DEBUG os_vif [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.972 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.973 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.973 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.977 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.977 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06aeb511-67, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.978 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06aeb511-67, col_values=(('external_ids', {'iface-id': '06aeb511-67a6-4547-b061-9c4760285e3b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:69:9c', 'vm-uuid': 'b8ea49c6-5f62-47b0-92cc-7399bfc98528'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.979 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:27 compute-2 NetworkManager[48964]: <info>  [1769163747.9808] manager: (tap06aeb511-67): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.981 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.987 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:27 compute-2 nova_compute[225701]: 2026-01-23 10:22:27.988 225706 INFO os_vif [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67')
Jan 23 10:22:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/815013215' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:22:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2638325462' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.170 225706 DEBUG nova.network.neutron [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Updated VIF entry in instance network info cache for port 06aeb511-67a6-4547-b061-9c4760285e3b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.172 225706 DEBUG nova.network.neutron [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Updating instance_info_cache with network_info: [{"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.184 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.185 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.185 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:39:69:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.186 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Using config drive
Jan 23 10:22:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:28 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.214 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.221 225706 DEBUG oslo_concurrency.lockutils [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.522 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:28 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00008b50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:28.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.719 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Creating config drive at /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.726 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwxdhw6_t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.853 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwxdhw6_t" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.884 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:22:28 compute-2 nova_compute[225701]: 2026-01-23 10:22:28.888 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.076 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.077 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Deleting local config drive /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config because it was imported into RBD.
Jan 23 10:22:29 compute-2 systemd[1]: Starting libvirt secret daemon...
Jan 23 10:22:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:29.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:29 compute-2 systemd[1]: Started libvirt secret daemon.
Jan 23 10:22:29 compute-2 ceph-mon[75771]: pgmap v876: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:22:29 compute-2 kernel: tap06aeb511-67: entered promiscuous mode
Jan 23 10:22:29 compute-2 NetworkManager[48964]: <info>  [1769163749.1719] manager: (tap06aeb511-67): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 23 10:22:29 compute-2 ovn_controller[132789]: 2026-01-23T10:22:29Z|00037|binding|INFO|Claiming lport 06aeb511-67a6-4547-b061-9c4760285e3b for this chassis.
Jan 23 10:22:29 compute-2 ovn_controller[132789]: 2026-01-23T10:22:29Z|00038|binding|INFO|06aeb511-67a6-4547-b061-9c4760285e3b: Claiming fa:16:3e:39:69:9c 10.100.0.25
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.174 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.177 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.185 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:69:9c 10.100.0.25'], port_security=['fa:16:3e:39:69:9c 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': 'b8ea49c6-5f62-47b0-92cc-7399bfc98528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a09a282-aa22-47cf-a68d-ce0dba493868', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a9d12c65-6e30-4f8d-be47-424dc8b73a1d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8dee54ab-ce3c-4b4e-ac76-15d1824a947d, chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=06aeb511-67a6-4547-b061-9c4760285e3b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.186 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 06aeb511-67a6-4547-b061-9c4760285e3b in datapath 6a09a282-aa22-47cf-a68d-ce0dba493868 bound to our chassis
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.188 142606 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.206 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc3b7a0-db6b-4fe6-92d5-424931dfaba9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.207 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a09a282-a1 in ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:22:29 compute-2 systemd-machined[194368]: New machine qemu-2-instance-00000007.
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.209 229823 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a09a282-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.210 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[994d1fd9-c473-4a6c-ad2b-48da4d520f6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.210 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[ed323c88-2687-4baf-a761-07c3614dbd89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.216 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:29 compute-2 ovn_controller[132789]: 2026-01-23T10:22:29Z|00039|binding|INFO|Setting lport 06aeb511-67a6-4547-b061-9c4760285e3b ovn-installed in OVS
Jan 23 10:22:29 compute-2 ovn_controller[132789]: 2026-01-23T10:22:29Z|00040|binding|INFO|Setting lport 06aeb511-67a6-4547-b061-9c4760285e3b up in Southbound
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.221 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:29 compute-2 systemd[1]: Started Virtual Machine qemu-2-instance-00000007.
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.227 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed09ec5-7143-4b01-939e-6bb989ee2ba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 systemd-udevd[232463]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:22:29 compute-2 NetworkManager[48964]: <info>  [1769163749.2417] device (tap06aeb511-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:22:29 compute-2 NetworkManager[48964]: <info>  [1769163749.2423] device (tap06aeb511-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.243 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[b584c21b-2138-4546-9e74-7596049bc21b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.274 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[dba45c58-3d96-400c-b746-5e7bf5dfc3f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.279 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[69446257-bf41-4ff6-bbdd-cf21151daa1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 NetworkManager[48964]: <info>  [1769163749.2816] manager: (tap6a09a282-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.312 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[0430c7e1-bfd7-4d2f-9f6a-5eea8b979138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.315 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[7a4dbbee-3d58-44c6-8b6d-27efdc5a8ddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 NetworkManager[48964]: <info>  [1769163749.3356] device (tap6a09a282-a0): carrier: link connected
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.341 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7dfbf1-4b31-4fe0-ab4c-6a20e7611c17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.356 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[d99023eb-843f-4a2a-9514-3ad3d757f417]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a09a282-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:9b:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490198, 'reachable_time': 30952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232494, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.370 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[0a29cf73-efb1-45dc-a294-7ba77a7759e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:9ba3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490198, 'tstamp': 490198}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232495, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.384 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[1bde2b59-c9f7-4ff3-9a7f-3dba601ca08e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a09a282-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:9b:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490198, 'reachable_time': 30952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232496, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.414 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[27b79cc3-53c6-43be-8328-fdd8b99ed94e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:29 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.468 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[b597dd07-4a0f-4698-95b7-241b1d3ab055]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.469 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a09a282-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.469 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.470 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a09a282-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:29 compute-2 kernel: tap6a09a282-a0: entered promiscuous mode
Jan 23 10:22:29 compute-2 NetworkManager[48964]: <info>  [1769163749.4724] manager: (tap6a09a282-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.471 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.473 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.474 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a09a282-a0, col_values=(('external_ids', {'iface-id': 'f3eaa8c6-94ad-445d-ab48-59e26f30c078'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.475 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:29 compute-2 ovn_controller[132789]: 2026-01-23T10:22:29Z|00041|binding|INFO|Releasing lport f3eaa8c6-94ad-445d-ab48-59e26f30c078 from this chassis (sb_readonly=0)
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.493 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.494 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.494 142606 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.495 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[94d333b0-67ba-41f4-a5ed-750bfb3471e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.496 142606 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: global
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     log         /dev/log local0 debug
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     log-tag     haproxy-metadata-proxy-6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     user        root
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     group       root
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     maxconn     1024
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     pidfile     /var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     daemon
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: defaults
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     log global
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     mode http
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     option httplog
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     option dontlognull
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     option http-server-close
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     option forwardfor
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     retries                 3
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     timeout http-request    30s
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     timeout connect         30s
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     timeout client          32s
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     timeout server          32s
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     timeout http-keep-alive 30s
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: listen listener
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     bind 169.254.169.254:80
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:     http-request add-header X-OVN-Network-ID 6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:22:29 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.497 142606 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'env', 'PROCESS_TAG=haproxy-6a09a282-aa22-47cf-a68d-ce0dba493868', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a09a282-aa22-47cf-a68d-ce0dba493868.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.516 225706 DEBUG nova.compute.manager [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.516 225706 DEBUG oslo_concurrency.lockutils [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.517 225706 DEBUG oslo_concurrency.lockutils [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.517 225706 DEBUG oslo_concurrency.lockutils [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:29 compute-2 nova_compute[225701]: 2026-01-23 10:22:29.517 225706 DEBUG nova.compute.manager [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Processing event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 10:22:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:29 compute-2 podman[232526]: 2026-01-23 10:22:29.850597953 +0000 UTC m=+0.046128653 container create dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:22:29 compute-2 systemd[1]: Started libpod-conmon-dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a.scope.
Jan 23 10:22:29 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:22:29 compute-2 podman[232526]: 2026-01-23 10:22:29.826652034 +0000 UTC m=+0.022182754 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:22:29 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e998d09fd3fbb881a05bc419ded091790e7655197f061ed433d1f7bd8e2ac3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:29 compute-2 podman[232526]: 2026-01-23 10:22:29.942775995 +0000 UTC m=+0.138306715 container init dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 10:22:29 compute-2 podman[232526]: 2026-01-23 10:22:29.948125213 +0000 UTC m=+0.143655913 container start dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 10:22:29 compute-2 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [NOTICE]   (232562) : New worker (232566) forked
Jan 23 10:22:29 compute-2 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [NOTICE]   (232562) : Loading success.
Jan 23 10:22:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.137 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.138 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163750.1371768, b8ea49c6-5f62-47b0-92cc-7399bfc98528 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.138 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] VM Started (Lifecycle Event)
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.141 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.144 225706 INFO nova.virt.libvirt.driver [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance spawned successfully.
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.144 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.161 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.167 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.171 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.171 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.172 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.172 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.173 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.173 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.202 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.203 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163750.1374104, b8ea49c6-5f62-47b0-92cc-7399bfc98528 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.203 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] VM Paused (Lifecycle Event)
Jan 23 10:22:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:30 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.225 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.228 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163750.1405203, b8ea49c6-5f62-47b0-92cc-7399bfc98528 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.228 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] VM Resumed (Lifecycle Event)
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.239 225706 INFO nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Took 6.61 seconds to spawn the instance on the hypervisor.
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.239 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.250 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.253 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.283 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.303 225706 INFO nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Took 7.60 seconds to build instance.
Jan 23 10:22:30 compute-2 nova_compute[225701]: 2026-01-23 10:22:30.325 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:30 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:30.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:31.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:31 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00009470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:31 compute-2 nova_compute[225701]: 2026-01-23 10:22:31.631 225706 DEBUG nova.compute.manager [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:22:31 compute-2 nova_compute[225701]: 2026-01-23 10:22:31.632 225706 DEBUG oslo_concurrency.lockutils [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:31 compute-2 nova_compute[225701]: 2026-01-23 10:22:31.633 225706 DEBUG oslo_concurrency.lockutils [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:31 compute-2 nova_compute[225701]: 2026-01-23 10:22:31.633 225706 DEBUG oslo_concurrency.lockutils [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:31 compute-2 nova_compute[225701]: 2026-01-23 10:22:31.633 225706 DEBUG nova.compute.manager [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] No waiting events found dispatching network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:22:31 compute-2 nova_compute[225701]: 2026-01-23 10:22:31.633 225706 WARNING nova.compute.manager [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received unexpected event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b for instance with vm_state active and task_state None.
Jan 23 10:22:31 compute-2 ceph-mon[75771]: pgmap v877: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:22:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:32 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:32 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:32.270 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:32 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:32.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:32 compute-2 nova_compute[225701]: 2026-01-23 10:22:32.980 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:33.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:33 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:33 compute-2 nova_compute[225701]: 2026-01-23 10:22:33.524 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:33 compute-2 ceph-mon[75771]: pgmap v878: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Jan 23 10:22:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:34 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00009470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:34 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:34.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:22:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:35.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:22:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:35 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:35 compute-2 ceph-mon[75771]: pgmap v879: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:22:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:22:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:36 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:36 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce0000a180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:36.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:37.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:37 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:37 compute-2 ceph-mon[75771]: pgmap v880: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 186 op/s
Jan 23 10:22:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:37 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:37 compute-2 nova_compute[225701]: 2026-01-23 10:22:37.983 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:38 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:38 compute-2 nova_compute[225701]: 2026-01-23 10:22:38.525 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:38 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:38.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:39.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:39 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce0000a180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:39 compute-2 ceph-mon[75771]: pgmap v881: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.6 MiB/s wr, 207 op/s
Jan 23 10:22:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:40 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:40 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:22:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:40.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:22:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:22:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:41.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:22:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:41 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:41 compute-2 ceph-mon[75771]: pgmap v882: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 17 KiB/s wr, 206 op/s
Jan 23 10:22:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:42 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce0000a180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:42 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:42.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:42 compute-2 nova_compute[225701]: 2026-01-23 10:22:42.985 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:43.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:43 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:43 compute-2 nova_compute[225701]: 2026-01-23 10:22:43.528 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:43 compute-2 ovn_controller[132789]: 2026-01-23T10:22:43Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:69:9c 10.100.0.25
Jan 23 10:22:43 compute-2 ovn_controller[132789]: 2026-01-23T10:22:43Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:69:9c 10.100.0.25
Jan 23 10:22:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:43 compute-2 ceph-mon[75771]: pgmap v883: 353 pgs: 353 active+clean; 175 MiB data, 332 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 1.2 MiB/s wr, 286 op/s
Jan 23 10:22:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:44 compute-2 kernel: ganesha.nfsd[232028]: segfault at 50 ip 00007fce8c07632e sp 00007fce127fb210 error 4 in libntirpc.so.5.8[7fce8c05b000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 23 10:22:44 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:22:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:44 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8003cc0 fd 39 proxy ignored for local
Jan 23 10:22:44 compute-2 systemd[1]: Started Process Core Dump (PID 232614/UID 0).
Jan 23 10:22:44 compute-2 sudo[232616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:22:44 compute-2 sudo[232616]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:44 compute-2 sudo[232616]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:44 compute-2 sudo[232642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:22:44 compute-2 sudo[232642]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:22:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:44.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:22:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:44 compute-2 sudo[232683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:22:44 compute-2 sudo[232683]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:44 compute-2 sudo[232683]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:45 compute-2 sudo[232642]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:45.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:45 compute-2 ceph-mon[75771]: pgmap v884: 353 pgs: 353 active+clean; 175 MiB data, 332 MiB used, 60 GiB / 60 GiB avail; 248 KiB/s rd, 1.1 MiB/s wr, 211 op/s
Jan 23 10:22:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:22:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:22:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:22:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:22:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:22:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:22:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:22:45 compute-2 systemd-coredump[232615]: Process 231905 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 43:
                                                    #0  0x00007fce8c07632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:22:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:46 compute-2 systemd[1]: systemd-coredump@12-232614-0.service: Deactivated successfully.
Jan 23 10:22:46 compute-2 systemd[1]: systemd-coredump@12-232614-0.service: Consumed 1.770s CPU time.
Jan 23 10:22:46 compute-2 podman[232729]: 2026-01-23 10:22:46.135171732 +0000 UTC m=+0.026270122 container died f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:22:46 compute-2 systemd[1]: var-lib-containers-storage-overlay-b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783-merged.mount: Deactivated successfully.
Jan 23 10:22:46 compute-2 podman[232729]: 2026-01-23 10:22:46.176343276 +0000 UTC m=+0.067441656 container remove f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:22:46 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:22:46 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:22:46 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.533s CPU time.
Jan 23 10:22:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:22:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:46.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:22:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:22:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:47.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:22:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:47 compute-2 ceph-mon[75771]: pgmap v885: 353 pgs: 353 active+clean; 199 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 460 KiB/s rd, 2.1 MiB/s wr, 243 op/s
Jan 23 10:22:47 compute-2 nova_compute[225701]: 2026-01-23 10:22:47.988 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:48 compute-2 nova_compute[225701]: 2026-01-23 10:22:48.529 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 10:22:48 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2595657408' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:22:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 10:22:48 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2595657408' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:22:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:48.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2595657408' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:22:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2595657408' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:22:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:49.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:49 compute-2 sudo[232775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:22:49 compute-2 sudo[232775]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:49 compute-2 sudo[232775]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:49 compute-2 ceph-mon[75771]: pgmap v886: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 446 KiB/s rd, 2.1 MiB/s wr, 162 op/s
Jan 23 10:22:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:22:49 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:22:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102250 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:22:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:22:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:50.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:22:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:22:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:22:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:51.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:22:51 compute-2 podman[232803]: 2026-01-23 10:22:51.635823462 +0000 UTC m=+0.054810959 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 10:22:51 compute-2 podman[232802]: 2026-01-23 10:22:51.665604519 +0000 UTC m=+0.092184691 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:22:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:51 compute-2 ceph-mon[75771]: pgmap v887: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 417 KiB/s rd, 2.1 MiB/s wr, 114 op/s
Jan 23 10:22:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:22:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:52.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:22:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:52 compute-2 ceph-mon[75771]: pgmap v888: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 417 KiB/s rd, 2.2 MiB/s wr, 115 op/s
Jan 23 10:22:52 compute-2 nova_compute[225701]: 2026-01-23 10:22:52.990 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:53.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:53 compute-2 nova_compute[225701]: 2026-01-23 10:22:53.532 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:54.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:22:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:55.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:22:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:55.489 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:55.490 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:55.490 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:55 compute-2 ceph-mon[75771]: pgmap v889: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 253 KiB/s rd, 1.0 MiB/s wr, 35 op/s
Jan 23 10:22:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.401 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.402 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.402 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.402 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.403 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.405 225706 INFO nova.compute.manager [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Terminating instance
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.406 225706 DEBUG nova.compute.manager [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 10:22:56 compute-2 kernel: tap06aeb511-67 (unregistering): left promiscuous mode
Jan 23 10:22:56 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 13.
Jan 23 10:22:56 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:22:56 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.533s CPU time.
Jan 23 10:22:56 compute-2 NetworkManager[48964]: <info>  [1769163776.4793] device (tap06aeb511-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:22:56 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:22:56 compute-2 ovn_controller[132789]: 2026-01-23T10:22:56Z|00042|binding|INFO|Releasing lport 06aeb511-67a6-4547-b061-9c4760285e3b from this chassis (sb_readonly=0)
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.493 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:56 compute-2 ovn_controller[132789]: 2026-01-23T10:22:56Z|00043|binding|INFO|Setting lport 06aeb511-67a6-4547-b061-9c4760285e3b down in Southbound
Jan 23 10:22:56 compute-2 ovn_controller[132789]: 2026-01-23T10:22:56Z|00044|binding|INFO|Removing iface tap06aeb511-67 ovn-installed in OVS
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.500 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:69:9c 10.100.0.25'], port_security=['fa:16:3e:39:69:9c 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': 'b8ea49c6-5f62-47b0-92cc-7399bfc98528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a09a282-aa22-47cf-a68d-ce0dba493868', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a9d12c65-6e30-4f8d-be47-424dc8b73a1d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8dee54ab-ce3c-4b4e-ac76-15d1824a947d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=06aeb511-67a6-4547-b061-9c4760285e3b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.501 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 06aeb511-67a6-4547-b061-9c4760285e3b in datapath 6a09a282-aa22-47cf-a68d-ce0dba493868 unbound from our chassis
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.502 142606 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a09a282-aa22-47cf-a68d-ce0dba493868, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.503 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[f77d59dd-e119-4549-928d-e5600508c9a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.504 142606 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 namespace which is not needed anymore
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.511 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:56 compute-2 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 23 10:22:56 compute-2 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Consumed 13.750s CPU time.
Jan 23 10:22:56 compute-2 systemd-machined[194368]: Machine qemu-2-instance-00000007 terminated.
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.657 225706 INFO nova.virt.libvirt.driver [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance destroyed successfully.
Jan 23 10:22:56 compute-2 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [NOTICE]   (232562) : haproxy version is 2.8.14-c23fe91
Jan 23 10:22:56 compute-2 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [NOTICE]   (232562) : path to executable is /usr/sbin/haproxy
Jan 23 10:22:56 compute-2 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [WARNING]  (232562) : Exiting Master process...
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.658 225706 DEBUG nova.objects.instance [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid b8ea49c6-5f62-47b0-92cc-7399bfc98528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:22:56 compute-2 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [WARNING]  (232562) : Exiting Master process...
Jan 23 10:22:56 compute-2 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [ALERT]    (232562) : Current worker (232566) exited with code 143 (Terminated)
Jan 23 10:22:56 compute-2 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [WARNING]  (232562) : All workers exited. Exiting... (0)
Jan 23 10:22:56 compute-2 systemd[1]: libpod-dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a.scope: Deactivated successfully.
Jan 23 10:22:56 compute-2 podman[232897]: 2026-01-23 10:22:56.670533778 +0000 UTC m=+0.059751419 container died dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.674 225706 DEBUG nova.virt.libvirt.vif [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:22:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1878761076',display_name='tempest-TestNetworkBasicOps-server-1878761076',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1878761076',id=7,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnVJwsrFV8aI2nrJoEZl6VyyMwYmX81xfzmKsfGpDRm0DGXIQaGmDmPINRbdeF1kx8Y5VA3JSgU3fPoWzBbPsDeXm0p5hq8BrMWr1cPqMrGzO08egHCDlwB5XDUgBL1OA==',key_name='tempest-TestNetworkBasicOps-1503023412',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:22:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-qvac9ktg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:30Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=b8ea49c6-5f62-47b0-92cc-7399bfc98528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.675 225706 DEBUG nova.network.os_vif_util [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.676 225706 DEBUG nova.network.os_vif_util [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.676 225706 DEBUG os_vif [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.680 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.681 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06aeb511-67, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.685 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.687 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.692 225706 INFO os_vif [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67')
Jan 23 10:22:56 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a-userdata-shm.mount: Deactivated successfully.
Jan 23 10:22:56 compute-2 systemd[1]: var-lib-containers-storage-overlay-47e998d09fd3fbb881a05bc419ded091790e7655197f061ed433d1f7bd8e2ac3-merged.mount: Deactivated successfully.
Jan 23 10:22:56 compute-2 podman[232897]: 2026-01-23 10:22:56.716636803 +0000 UTC m=+0.105854434 container cleanup dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 10:22:56 compute-2 systemd[1]: libpod-conmon-dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a.scope: Deactivated successfully.
Jan 23 10:22:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:56.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:56 compute-2 podman[232964]: 2026-01-23 10:22:56.758494915 +0000 UTC m=+0.040291675 container create 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 10:22:56 compute-2 podman[232980]: 2026-01-23 10:22:56.793148711 +0000 UTC m=+0.050979666 container remove dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 10:22:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.798 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[bc79a4f3-fca8-4b96-9cfc-b3808de9e1bf]: (4, ('Fri Jan 23 10:22:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 (dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a)\ndc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a\nFri Jan 23 10:22:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 (dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a)\ndc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.800 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[3cafc1bb-53cf-4ffc-8d0d-9a636a084a3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.801 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a09a282-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.802 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:56 compute-2 kernel: tap6a09a282-a0: left promiscuous mode
Jan 23 10:22:56 compute-2 podman[232964]: 2026-01-23 10:22:56.812439482 +0000 UTC m=+0.094236272 container init 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 10:22:56 compute-2 nova_compute[225701]: 2026-01-23 10:22:56.820 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:56 compute-2 podman[232964]: 2026-01-23 10:22:56.82305101 +0000 UTC m=+0.104847770 container start 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.823 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[ef75d7a4-3a7d-4505-9c6a-64fad7bf6b50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:56 compute-2 bash[232964]: 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a
Jan 23 10:22:56 compute-2 podman[232964]: 2026-01-23 10:22:56.742452364 +0000 UTC m=+0.024249134 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:22:56 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.839 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b3b907-b788-4c09-b943-2d4b7c040085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.840 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[fd7e422f-64fc-4397-9419-d36eaf343fc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.852 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[a90f14d8-e5f8-45d2-8953-8c22b7ada38b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490191, 'reachable_time': 26342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233006, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.855 142723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:22:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.855 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1d1968-0a23-47a7-8032-6bff95743003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:56 compute-2 systemd[1]: run-netns-ovnmeta\x2d6a09a282\x2daa22\x2d47cf\x2da68d\x2dce0dba493868.mount: Deactivated successfully.
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:22:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:22:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:22:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:57.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.314 225706 INFO nova.virt.libvirt.driver [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Deleting instance files /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528_del
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.315 225706 INFO nova.virt.libvirt.driver [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Deletion of /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528_del complete
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.396 225706 DEBUG nova.compute.manager [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-unplugged-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.396 225706 DEBUG oslo_concurrency.lockutils [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.397 225706 DEBUG oslo_concurrency.lockutils [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.398 225706 DEBUG oslo_concurrency.lockutils [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.398 225706 DEBUG nova.compute.manager [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] No waiting events found dispatching network-vif-unplugged-06aeb511-67a6-4547-b061-9c4760285e3b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.399 225706 DEBUG nova.compute.manager [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-unplugged-06aeb511-67a6-4547-b061-9c4760285e3b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.400 225706 INFO nova.compute.manager [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Took 0.99 seconds to destroy the instance on the hypervisor.
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.401 225706 DEBUG oslo.service.loopingcall [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.402 225706 DEBUG nova.compute.manager [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 10:22:57 compute-2 nova_compute[225701]: 2026-01-23 10:22:57.402 225706 DEBUG nova.network.neutron [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 10:22:57 compute-2 ceph-mon[75771]: pgmap v890: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 253 KiB/s rd, 1.0 MiB/s wr, 35 op/s
Jan 23 10:22:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:58 compute-2 nova_compute[225701]: 2026-01-23 10:22:58.534 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:58.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:22:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:22:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:59.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:59 compute-2 nova_compute[225701]: 2026-01-23 10:22:59.481 225706 DEBUG nova.compute.manager [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:22:59 compute-2 nova_compute[225701]: 2026-01-23 10:22:59.481 225706 DEBUG oslo_concurrency.lockutils [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:59 compute-2 nova_compute[225701]: 2026-01-23 10:22:59.482 225706 DEBUG oslo_concurrency.lockutils [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:59 compute-2 nova_compute[225701]: 2026-01-23 10:22:59.482 225706 DEBUG oslo_concurrency.lockutils [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:59 compute-2 nova_compute[225701]: 2026-01-23 10:22:59.482 225706 DEBUG nova.compute.manager [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] No waiting events found dispatching network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:22:59 compute-2 nova_compute[225701]: 2026-01-23 10:22:59.482 225706 WARNING nova.compute.manager [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received unexpected event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b for instance with vm_state active and task_state deleting.
Jan 23 10:22:59 compute-2 ceph-mon[75771]: pgmap v891: 353 pgs: 353 active+clean; 173 MiB data, 330 MiB used, 60 GiB / 60 GiB avail; 43 KiB/s rd, 18 KiB/s wr, 6 op/s
Jan 23 10:22:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.481 225706 DEBUG nova.network.neutron [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.502 225706 INFO nova.compute.manager [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Took 3.10 seconds to deallocate network for instance.
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.541 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.542 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.549 225706 DEBUG nova.compute.manager [req-21b59a28-e133-4c8b-8f41-06fd329525d6 req-daa71949-13c6-4d8c-85b2-100e939a5b20 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-deleted-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.572 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.586 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.586 225706 DEBUG nova.compute.provider_tree [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.597 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.621 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.651 225706 DEBUG oslo_concurrency.processutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:23:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:00.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:00 compute-2 nova_compute[225701]: 2026-01-23 10:23:00.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:23:01 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2480691247' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:23:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:01.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:23:01 compute-2 nova_compute[225701]: 2026-01-23 10:23:01.151 225706 DEBUG oslo_concurrency.processutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:23:01 compute-2 nova_compute[225701]: 2026-01-23 10:23:01.156 225706 DEBUG nova.compute.provider_tree [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:23:01 compute-2 nova_compute[225701]: 2026-01-23 10:23:01.189 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:23:01 compute-2 nova_compute[225701]: 2026-01-23 10:23:01.221 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:01 compute-2 nova_compute[225701]: 2026-01-23 10:23:01.285 225706 INFO nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance b8ea49c6-5f62-47b0-92cc-7399bfc98528
Jan 23 10:23:01 compute-2 nova_compute[225701]: 2026-01-23 10:23:01.360 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.959s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:01 compute-2 nova_compute[225701]: 2026-01-23 10:23:01.685 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:01 compute-2 ceph-mon[75771]: pgmap v892: 353 pgs: 353 active+clean; 173 MiB data, 330 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 17 KiB/s wr, 3 op/s
Jan 23 10:23:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2480691247' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:02.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:02 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:23:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:02 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:23:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:03.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:03 compute-2 nova_compute[225701]: 2026-01-23 10:23:03.661 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:03 compute-2 nova_compute[225701]: 2026-01-23 10:23:03.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:03 compute-2 nova_compute[225701]: 2026-01-23 10:23:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:23:03 compute-2 nova_compute[225701]: 2026-01-23 10:23:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:23:03 compute-2 nova_compute[225701]: 2026-01-23 10:23:03.809 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:23:03 compute-2 nova_compute[225701]: 2026-01-23 10:23:03.810 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:03 compute-2 ceph-mon[75771]: pgmap v893: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 20 KiB/s wr, 30 op/s
Jan 23 10:23:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:23:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:04.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:23:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:04 compute-2 sudo[233075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:23:04 compute-2 sudo[233075]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:04 compute-2 sudo[233075]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:05.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:05 compute-2 ceph-mon[75771]: pgmap v894: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 8.4 KiB/s wr, 30 op/s
Jan 23 10:23:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:23:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:06 compute-2 nova_compute[225701]: 2026-01-23 10:23:06.688 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:06.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:06 compute-2 nova_compute[225701]: 2026-01-23 10:23:06.805 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1596926985' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:07.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:07 compute-2 nova_compute[225701]: 2026-01-23 10:23:07.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:07 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:08 compute-2 ceph-mon[75771]: pgmap v895: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 8.8 KiB/s wr, 32 op/s
Jan 23 10:23:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2002394447' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.261 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.261 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.262 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.262 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.263 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.664 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:23:08 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/644210232' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:08.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.752 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.907 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.908 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4856MB free_disk=59.942562103271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.908 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:08 compute-2 nova_compute[225701]: 2026-01-23 10:23:08.909 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:23:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:23:09 compute-2 nova_compute[225701]: 2026-01-23 10:23:09.063 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:09 compute-2 ceph-mon[75771]: pgmap v896: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 3.2 KiB/s wr, 31 op/s
Jan 23 10:23:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/644210232' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:09.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:09 compute-2 nova_compute[225701]: 2026-01-23 10:23:09.300 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:23:09 compute-2 nova_compute[225701]: 2026-01-23 10:23:09.300 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:23:09 compute-2 nova_compute[225701]: 2026-01-23 10:23:09.318 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbef0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:23:09 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1432565687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:09 compute-2 nova_compute[225701]: 2026-01-23 10:23:09.747 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:23:09 compute-2 nova_compute[225701]: 2026-01-23 10:23:09.752 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:23:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:10 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc0016c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1432565687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1488791737' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:10 compute-2 nova_compute[225701]: 2026-01-23 10:23:10.632 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:23:10 compute-2 nova_compute[225701]: 2026-01-23 10:23:10.659 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:23:10 compute-2 nova_compute[225701]: 2026-01-23 10:23:10.659 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:10 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:10.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:11.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:11 compute-2 ceph-mon[75771]: pgmap v897: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.2 KiB/s wr, 29 op/s
Jan 23 10:23:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:11 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:11 compute-2 nova_compute[225701]: 2026-01-23 10:23:11.655 225706 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163776.6524646, b8ea49c6-5f62-47b0-92cc-7399bfc98528 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:23:11 compute-2 nova_compute[225701]: 2026-01-23 10:23:11.655 225706 INFO nova.compute.manager [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] VM Stopped (Lifecycle Event)
Jan 23 10:23:11 compute-2 nova_compute[225701]: 2026-01-23 10:23:11.689 225706 DEBUG nova.compute.manager [None req-dd69c3ed-56ee-42a9-8f93-f3a4b28879e1 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:23:11 compute-2 nova_compute[225701]: 2026-01-23 10:23:11.691 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:12 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102312 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:23:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/721921800' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:12 compute-2 nova_compute[225701]: 2026-01-23 10:23:12.660 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:12 compute-2 nova_compute[225701]: 2026-01-23 10:23:12.661 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:12 compute-2 nova_compute[225701]: 2026-01-23 10:23:12.661 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:12 compute-2 nova_compute[225701]: 2026-01-23 10:23:12.662 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:12 compute-2 nova_compute[225701]: 2026-01-23 10:23:12.662 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:12 compute-2 nova_compute[225701]: 2026-01-23 10:23:12.662 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:23:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:12 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:12.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:12 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:13.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:13 compute-2 ceph-mon[75771]: pgmap v898: 353 pgs: 353 active+clean; 80 MiB data, 282 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 5.2 KiB/s wr, 35 op/s
Jan 23 10:23:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:13 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:13 compute-2 nova_compute[225701]: 2026-01-23 10:23:13.666 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:14 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:14 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:14 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1573239966' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:14.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:15.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:15 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:15 compute-2 ceph-mon[75771]: pgmap v899: 353 pgs: 353 active+clean; 80 MiB data, 282 MiB used, 60 GiB / 60 GiB avail; 4.0 KiB/s rd, 2.4 KiB/s wr, 7 op/s
Jan 23 10:23:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:16 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:16 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:16 compute-2 nova_compute[225701]: 2026-01-23 10:23:16.694 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:23:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:16.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:23:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:17.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:17 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:17 compute-2 ceph-mon[75771]: pgmap v900: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.2 KiB/s wr, 30 op/s
Jan 23 10:23:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:18 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:18 compute-2 nova_compute[225701]: 2026-01-23 10:23:18.668 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:18 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:18.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:19.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:19 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:19 compute-2 ceph-mon[75771]: pgmap v901: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.8 KiB/s wr, 28 op/s
Jan 23 10:23:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:20 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:23:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:20 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:20.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:21.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:21 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:21 compute-2 ceph-mon[75771]: pgmap v902: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.8 KiB/s wr, 28 op/s
Jan 23 10:23:21 compute-2 nova_compute[225701]: 2026-01-23 10:23:21.697 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:22 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:22 compute-2 podman[233177]: 2026-01-23 10:23:22.341592204 +0000 UTC m=+0.057224716 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 10:23:22 compute-2 podman[233176]: 2026-01-23 10:23:22.408822695 +0000 UTC m=+0.128955607 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:23:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:22 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:22.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:23.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:23 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:23 compute-2 nova_compute[225701]: 2026-01-23 10:23:23.670 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:23 compute-2 ceph-mon[75771]: pgmap v903: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.8 KiB/s wr, 28 op/s
Jan 23 10:23:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:24 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:24 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:24.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:25 compute-2 sudo[233225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:23:25 compute-2 sudo[233225]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:25 compute-2 sudo[233225]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:25.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:25 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:25 compute-2 ceph-mon[75771]: pgmap v904: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 852 B/s wr, 22 op/s
Jan 23 10:23:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:26 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:26 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:26 compute-2 nova_compute[225701]: 2026-01-23 10:23:26.701 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:26.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:27.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:27 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:27 compute-2 ceph-mon[75771]: pgmap v905: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 853 B/s wr, 22 op/s
Jan 23 10:23:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:28 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:28 compute-2 nova_compute[225701]: 2026-01-23 10:23:28.672 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:28 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:28.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:29.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:29 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:29 compute-2 ceph-mon[75771]: pgmap v906: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:23:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:30 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:30 compute-2 nova_compute[225701]: 2026-01-23 10:23:30.518 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:23:30.518 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:23:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:23:30.520 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:23:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:30 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:30.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:31.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:31 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:31 compute-2 nova_compute[225701]: 2026-01-23 10:23:31.703 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:32 compute-2 ceph-mon[75771]: pgmap v907: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:23:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:32 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:32 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:23:32.524 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:23:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:32 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:32.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:33 compute-2 ceph-mon[75771]: pgmap v908: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:23:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:33.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:33 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:33 compute-2 nova_compute[225701]: 2026-01-23 10:23:33.674 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:34 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:34 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102334 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:23:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:34.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:35.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:35 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:35 compute-2 ceph-mon[75771]: pgmap v909: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:23:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:23:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:36 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:36 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:36 compute-2 nova_compute[225701]: 2026-01-23 10:23:36.705 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:36.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:37.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:37 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:38 compute-2 ceph-mon[75771]: pgmap v910: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:23:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:38 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:38 compute-2 nova_compute[225701]: 2026-01-23 10:23:38.674 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:38 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:38.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:39.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:39 compute-2 ceph-mon[75771]: pgmap v911: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:23:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:39 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:40 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:40 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:41.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:41 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:41 compute-2 nova_compute[225701]: 2026-01-23 10:23:41.760 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:41 compute-2 ceph-mon[75771]: pgmap v912: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:23:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.870980) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821871203, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1624, "num_deletes": 505, "total_data_size": 3294691, "memory_usage": 3352160, "flush_reason": "Manual Compaction"}
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821883003, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1383209, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28353, "largest_seqno": 29972, "table_properties": {"data_size": 1377972, "index_size": 2057, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16001, "raw_average_key_size": 19, "raw_value_size": 1364717, "raw_average_value_size": 1630, "num_data_blocks": 90, "num_entries": 837, "num_filter_entries": 837, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163707, "oldest_key_time": 1769163707, "file_creation_time": 1769163821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 12055 microseconds, and 5686 cpu microseconds.
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.883096) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1383209 bytes OK
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.883133) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.884942) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.884963) EVENT_LOG_v1 {"time_micros": 1769163821884959, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.884981) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3286298, prev total WAL file size 3286298, number of live WAL files 2.
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.886084) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353130' seq:72057594037927935, type:22 .. '6C6F676D00373631' seq:0, type:0; will stop at (end)
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1350KB)], [54(14MB)]
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821886218, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16235185, "oldest_snapshot_seqno": -1}
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5783 keys, 12761257 bytes, temperature: kUnknown
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821966194, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 12761257, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12723894, "index_size": 21829, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14469, "raw_key_size": 149269, "raw_average_key_size": 25, "raw_value_size": 12620398, "raw_average_value_size": 2182, "num_data_blocks": 877, "num_entries": 5783, "num_filter_entries": 5783, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.966564) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 12761257 bytes
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.968263) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.7 rd, 159.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 14.2 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(21.0) write-amplify(9.2) OK, records in: 6761, records dropped: 978 output_compression: NoCompression
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.968288) EVENT_LOG_v1 {"time_micros": 1769163821968276, "job": 32, "event": "compaction_finished", "compaction_time_micros": 80083, "compaction_time_cpu_micros": 30584, "output_level": 6, "num_output_files": 1, "total_output_size": 12761257, "num_input_records": 6761, "num_output_records": 5783, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821968792, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821972346, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.885886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972409) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:41 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:42 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:42 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:23:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:42 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:42.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:42 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2132895342' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:43.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:43 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:43 compute-2 nova_compute[225701]: 2026-01-23 10:23:43.676 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:43 compute-2 ceph-mon[75771]: pgmap v913: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:23:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:44 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:44 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:44.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:45 compute-2 sudo[233275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:23:45 compute-2 sudo[233275]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:45 compute-2 sudo[233275]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:45.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:45 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:45 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:23:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:45 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:23:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:45 compute-2 ceph-mon[75771]: pgmap v914: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:23:45 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3313488858' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:23:45 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1984605226' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:23:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:46 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:46 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:46 compute-2 nova_compute[225701]: 2026-01-23 10:23:46.763 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:46.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:47.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:47 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:47 compute-2 ceph-mon[75771]: pgmap v915: 353 pgs: 353 active+clean; 68 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.1 MiB/s wr, 6 op/s
Jan 23 10:23:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:48 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:48 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:23:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 10:23:48 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1571219015' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:23:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 10:23:48 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1571219015' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:23:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:48 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:48 compute-2 nova_compute[225701]: 2026-01-23 10:23:48.738 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:23:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:48.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:23:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1571219015' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:23:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1571219015' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:23:48 compute-2 ceph-mon[75771]: pgmap v916: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:23:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:49.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:49 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:49 compute-2 sudo[233304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:23:49 compute-2 sudo[233304]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:49 compute-2 sudo[233304]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:49 compute-2 sudo[233329]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:23:49 compute-2 sudo[233329]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:23:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:50 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:50 compute-2 sudo[233329]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:50 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:50.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:23:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:23:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:23:51 compute-2 ceph-mon[75771]: pgmap v917: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:23:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:23:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:23:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:23:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:23:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:51.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:51 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:51 compute-2 nova_compute[225701]: 2026-01-23 10:23:51.766 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:52 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:52 compute-2 podman[233389]: 2026-01-23 10:23:52.662595896 +0000 UTC m=+0.072740857 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 10:23:52 compute-2 podman[233388]: 2026-01-23 10:23:52.698709302 +0000 UTC m=+0.107646074 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 10:23:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:52 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:52.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:23:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:53.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:23:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:53 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:53 compute-2 nova_compute[225701]: 2026-01-23 10:23:53.742 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:53 compute-2 ceph-mon[75771]: pgmap v918: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 104 op/s
Jan 23 10:23:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:54 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:54 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102354 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:23:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:23:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:54.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:23:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:55.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:55 compute-2 sudo[233435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:23:55 compute-2 sudo[233435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:55 compute-2 sudo[233435]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:23:55.491 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:23:55.491 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:23:55.492 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:55 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:55 compute-2 ovn_controller[132789]: 2026-01-23T10:23:55Z|00045|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 23 10:23:55 compute-2 ceph-mon[75771]: pgmap v919: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 103 op/s
Jan 23 10:23:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:23:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:23:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:56 compute-2 nova_compute[225701]: 2026-01-23 10:23:56.770 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:56 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2411302437' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:56.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:57.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:57 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:57 compute-2 ceph-mon[75771]: pgmap v920: 353 pgs: 353 active+clean; 61 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 109 op/s
Jan 23 10:23:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:58 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:58 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:58 compute-2 nova_compute[225701]: 2026-01-23 10:23:58.744 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:58.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:23:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:23:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:59.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:59 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:59 compute-2 ceph-mon[75771]: pgmap v921: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 737 KiB/s wr, 123 op/s
Jan 23 10:23:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:00 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:00 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:00 compute-2 nova_compute[225701]: 2026-01-23 10:24:00.780 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.003000072s ======
Jan 23 10:24:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:00.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000072s
Jan 23 10:24:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:01.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:01 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:01 compute-2 nova_compute[225701]: 2026-01-23 10:24:01.773 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:01 compute-2 ceph-mon[75771]: pgmap v922: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.5 KiB/s wr, 92 op/s
Jan 23 10:24:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:02 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:02 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:02.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:03.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:03 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:03 compute-2 nova_compute[225701]: 2026-01-23 10:24:03.747 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:03 compute-2 nova_compute[225701]: 2026-01-23 10:24:03.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:03 compute-2 nova_compute[225701]: 2026-01-23 10:24:03.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:24:03 compute-2 nova_compute[225701]: 2026-01-23 10:24:03.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:24:03 compute-2 nova_compute[225701]: 2026-01-23 10:24:03.802 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:24:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:03 compute-2 ceph-mon[75771]: pgmap v923: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.5 KiB/s wr, 93 op/s
Jan 23 10:24:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:04 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:04 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:04.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:05 compute-2 sudo[233470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:24:05 compute-2 sudo[233470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:05 compute-2 sudo[233470]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:05.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:05 compute-2 nova_compute[225701]: 2026-01-23 10:24:05.349 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:05 compute-2 nova_compute[225701]: 2026-01-23 10:24:05.349 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:05 compute-2 nova_compute[225701]: 2026-01-23 10:24:05.377 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 10:24:05 compute-2 nova_compute[225701]: 2026-01-23 10:24:05.485 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:05 compute-2 nova_compute[225701]: 2026-01-23 10:24:05.486 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:05 compute-2 nova_compute[225701]: 2026-01-23 10:24:05.494 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 10:24:05 compute-2 nova_compute[225701]: 2026-01-23 10:24:05.494 225706 INFO nova.compute.claims [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Claim successful on node compute-2.ctlplane.example.com
Jan 23 10:24:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:05 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:05 compute-2 nova_compute[225701]: 2026-01-23 10:24:05.630 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:05 compute-2 nova_compute[225701]: 2026-01-23 10:24:05.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:05 compute-2 ceph-mon[75771]: pgmap v924: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Jan 23 10:24:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:24:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:24:06 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3596100017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.088 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.096 225706 DEBUG nova.compute.provider_tree [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.119 225706 DEBUG nova.scheduler.client.report [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.138 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.139 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.256 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.257 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:24:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:06 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.316 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.339 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.434 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.435 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.435 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Creating image(s)
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.459 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.481 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.503 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.507 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.561 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.562 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.562 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.562 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.582 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.586 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c bae2b00f-87e8-40b7-b7ba-972f7c531998_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:06 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.769 225706 DEBUG nova.policy [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:24:06 compute-2 nova_compute[225701]: 2026-01-23 10:24:06.775 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:06.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3596100017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:07.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:07 compute-2 nova_compute[225701]: 2026-01-23 10:24:07.386 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c bae2b00f-87e8-40b7-b7ba-972f7c531998_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.800s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:07 compute-2 nova_compute[225701]: 2026-01-23 10:24:07.465 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 10:24:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:07 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:07 compute-2 nova_compute[225701]: 2026-01-23 10:24:07.570 225706 DEBUG nova.objects.instance [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid bae2b00f-87e8-40b7-b7ba-972f7c531998 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:24:07 compute-2 nova_compute[225701]: 2026-01-23 10:24:07.590 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 10:24:07 compute-2 nova_compute[225701]: 2026-01-23 10:24:07.591 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Ensure instance console log exists: /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 10:24:07 compute-2 nova_compute[225701]: 2026-01-23 10:24:07.592 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:07 compute-2 nova_compute[225701]: 2026-01-23 10:24:07.592 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:07 compute-2 nova_compute[225701]: 2026-01-23 10:24:07.593 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:08 compute-2 ceph-mon[75771]: pgmap v925: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Jan 23 10:24:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1334627562' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1749144063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:08 compute-2 nova_compute[225701]: 2026-01-23 10:24:08.749 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:08.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.055 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Successfully updated port: d744a552-c706-444a-8a15-4a98c41eed50 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.075 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.075 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.075 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:24:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1478679152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:09 compute-2 ceph-mon[75771]: pgmap v926: 353 pgs: 353 active+clean; 69 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 1.4 MiB/s wr, 34 op/s
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.172 225706 DEBUG nova.compute.manager [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-changed-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.172 225706 DEBUG nova.compute.manager [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Refreshing instance network info cache due to event network-changed-d744a552-c706-444a-8a15-4a98c41eed50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.172 225706 DEBUG oslo_concurrency.lockutils [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:24:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:09.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.576 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.807 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.808 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.808 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.808 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:24:09 compute-2 nova_compute[225701]: 2026-01-23 10:24:09.808 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3295464724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:24:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2577585059' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.243 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:10 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy ignored for local
Jan 23 10:24:10 compute-2 kernel: ganesha.nfsd[233132]: segfault at 50 ip 00007fbf73aa632e sp 00007fbef8ff8210 error 4 in libntirpc.so.5.8[7fbf73a8b000+2c000] likely on CPU 7 (core 0, socket 7)
Jan 23 10:24:10 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:24:10 compute-2 systemd[1]: Started Process Core Dump (PID 233709/UID 0).
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.393 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.394 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4884MB free_disk=59.97146224975586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.394 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.394 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.462 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Instance bae2b00f-87e8-40b7-b7ba-972f7c531998 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.463 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.463 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.513 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.534 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Updating instance_info_cache with network_info: [{"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.558 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.559 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance network_info: |[{"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.559 225706 DEBUG oslo_concurrency.lockutils [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.559 225706 DEBUG nova.network.neutron [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Refreshing network info cache for port d744a552-c706-444a-8a15-4a98c41eed50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.563 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start _get_guest_xml network_info=[{"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.568 225706 WARNING nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.573 225706 DEBUG nova.virt.libvirt.host [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.574 225706 DEBUG nova.virt.libvirt.host [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.577 225706 DEBUG nova.virt.libvirt.host [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.577 225706 DEBUG nova.virt.libvirt.host [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.578 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.578 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.579 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.579 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.579 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.580 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.580 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.580 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.581 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.581 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.581 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.581 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.586 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:10.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:24:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/287250540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.976 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.981 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:24:10 compute-2 nova_compute[225701]: 2026-01-23 10:24:10.999 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:24:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:24:11 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/308211205' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.029 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.030 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.030 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.061 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.065 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2577585059' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:11 compute-2 ceph-mon[75771]: pgmap v927: 353 pgs: 353 active+clean; 69 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 10 KiB/s rd, 1.4 MiB/s wr, 14 op/s
Jan 23 10:24:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/287250540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/308211205' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:24:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:11.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:24:11 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/469737160' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:24:11 compute-2 systemd-coredump[233710]: Process 233002 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 45:
                                                    #0  0x00007fbf73aa632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.521 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.523 225706 DEBUG nova.virt.libvirt.vif [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:24:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1790835147',display_name='tempest-TestNetworkBasicOps-server-1790835147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1790835147',id=9,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLwh/ci1qy20vB5FyaBepDv6KpYIxs8h6oo7gGlHu7RZtK7kr5mjuHzqdrX+yDa6v1DJrzMXWjaBuQGyTdeFGY8MLFkkRTd0XB8VJHoKHx7kcuI7EyiJu2dhMv2/NI1ZTg==',key_name='tempest-TestNetworkBasicOps-520442326',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-ox8kizyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:24:06Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=bae2b00f-87e8-40b7-b7ba-972f7c531998,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.523 225706 DEBUG nova.network.os_vif_util [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.524 225706 DEBUG nova.network.os_vif_util [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.526 225706 DEBUG nova.objects.instance [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid bae2b00f-87e8-40b7-b7ba-972f7c531998 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:24:11 compute-2 systemd[1]: systemd-coredump@13-233709-0.service: Deactivated successfully.
Jan 23 10:24:11 compute-2 systemd[1]: systemd-coredump@13-233709-0.service: Consumed 1.197s CPU time.
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.639 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] End _get_guest_xml xml=<domain type="kvm">
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <uuid>bae2b00f-87e8-40b7-b7ba-972f7c531998</uuid>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <name>instance-00000009</name>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <memory>131072</memory>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <vcpu>1</vcpu>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <metadata>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <nova:name>tempest-TestNetworkBasicOps-server-1790835147</nova:name>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <nova:creationTime>2026-01-23 10:24:10</nova:creationTime>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <nova:flavor name="m1.nano">
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <nova:memory>128</nova:memory>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <nova:disk>1</nova:disk>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <nova:swap>0</nova:swap>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <nova:vcpus>1</nova:vcpus>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       </nova:flavor>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <nova:owner>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       </nova:owner>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <nova:ports>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <nova:port uuid="d744a552-c706-444a-8a15-4a98c41eed50">
Jan 23 10:24:11 compute-2 nova_compute[225701]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         </nova:port>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       </nova:ports>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     </nova:instance>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   </metadata>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <sysinfo type="smbios">
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <system>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <entry name="manufacturer">RDO</entry>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <entry name="product">OpenStack Compute</entry>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <entry name="serial">bae2b00f-87e8-40b7-b7ba-972f7c531998</entry>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <entry name="uuid">bae2b00f-87e8-40b7-b7ba-972f7c531998</entry>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <entry name="family">Virtual Machine</entry>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     </system>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   </sysinfo>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <os>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <boot dev="hd"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <smbios mode="sysinfo"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   </os>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <features>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <acpi/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <apic/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <vmcoreinfo/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   </features>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <clock offset="utc">
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <timer name="hpet" present="no"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   </clock>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <cpu mode="host-model" match="exact">
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   </cpu>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   <devices>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <disk type="network" device="disk">
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <driver type="raw" cache="none"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <source protocol="rbd" name="vms/bae2b00f-87e8-40b7-b7ba-972f7c531998_disk">
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       </source>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <auth username="openstack">
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       </auth>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <target dev="vda" bus="virtio"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <disk type="network" device="cdrom">
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <driver type="raw" cache="none"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <source protocol="rbd" name="vms/bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config">
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       </source>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <auth username="openstack">
Jan 23 10:24:11 compute-2 nova_compute[225701]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       </auth>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <target dev="sda" bus="sata"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <interface type="ethernet">
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <mac address="fa:16:3e:9f:48:6d"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <model type="virtio"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <mtu size="1442"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <target dev="tapd744a552-c7"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     </interface>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <serial type="pty">
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <log file="/var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/console.log" append="off"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     </serial>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <video>
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <model type="virtio"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     </video>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <input type="tablet" bus="usb"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <rng model="virtio">
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <backend model="random">/dev/urandom</backend>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     </rng>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <controller type="usb" index="0"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     <memballoon model="virtio">
Jan 23 10:24:11 compute-2 nova_compute[225701]:       <stats period="10"/>
Jan 23 10:24:11 compute-2 nova_compute[225701]:     </memballoon>
Jan 23 10:24:11 compute-2 nova_compute[225701]:   </devices>
Jan 23 10:24:11 compute-2 nova_compute[225701]: </domain>
Jan 23 10:24:11 compute-2 nova_compute[225701]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.639 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Preparing to wait for external event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.640 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.640 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.640 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.641 225706 DEBUG nova.virt.libvirt.vif [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:24:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1790835147',display_name='tempest-TestNetworkBasicOps-server-1790835147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1790835147',id=9,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLwh/ci1qy20vB5FyaBepDv6KpYIxs8h6oo7gGlHu7RZtK7kr5mjuHzqdrX+yDa6v1DJrzMXWjaBuQGyTdeFGY8MLFkkRTd0XB8VJHoKHx7kcuI7EyiJu2dhMv2/NI1ZTg==',key_name='tempest-TestNetworkBasicOps-520442326',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-ox8kizyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:24:06Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=bae2b00f-87e8-40b7-b7ba-972f7c531998,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.641 225706 DEBUG nova.network.os_vif_util [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.642 225706 DEBUG nova.network.os_vif_util [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.642 225706 DEBUG os_vif [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.643 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.643 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.644 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.648 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.649 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd744a552-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.650 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd744a552-c7, col_values=(('external_ids', {'iface-id': 'd744a552-c706-444a-8a15-4a98c41eed50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:48:6d', 'vm-uuid': 'bae2b00f-87e8-40b7-b7ba-972f7c531998'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:11 compute-2 NetworkManager[48964]: <info>  [1769163851.6541] manager: (tapd744a552-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.654 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.656 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.660 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:11 compute-2 podman[233801]: 2026-01-23 10:24:11.662471141 +0000 UTC m=+0.031354361 container died 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.661 225706 INFO os_vif [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7')
Jan 23 10:24:11 compute-2 systemd[1]: var-lib-containers-storage-overlay-5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b-merged.mount: Deactivated successfully.
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.740 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.741 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.741 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:9f:48:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.741 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Using config drive
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.767 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.934 225706 DEBUG nova.network.neutron [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Updated VIF entry in instance network info cache for port d744a552-c706-444a-8a15-4a98c41eed50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.935 225706 DEBUG nova.network.neutron [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Updating instance_info_cache with network_info: [{"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:24:11 compute-2 nova_compute[225701]: 2026-01-23 10:24:11.950 225706 DEBUG oslo_concurrency.lockutils [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:24:12 compute-2 podman[233801]: 2026-01-23 10:24:12.029512693 +0000 UTC m=+0.398395913 container remove 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.029 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.030 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.030 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.030 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:24:12 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:24:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.164 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Creating config drive at /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.176 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz3mh5p61 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/469737160' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:24:12 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:24:12 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.599s CPU time.
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.306 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz3mh5p61" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.336 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.340 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.492 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.493 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Deleting local config drive /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config because it was imported into RBD.
Jan 23 10:24:12 compute-2 kernel: tapd744a552-c7: entered promiscuous mode
Jan 23 10:24:12 compute-2 NetworkManager[48964]: <info>  [1769163852.5628] manager: (tapd744a552-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.563 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 ovn_controller[132789]: 2026-01-23T10:24:12Z|00046|binding|INFO|Claiming lport d744a552-c706-444a-8a15-4a98c41eed50 for this chassis.
Jan 23 10:24:12 compute-2 ovn_controller[132789]: 2026-01-23T10:24:12Z|00047|binding|INFO|d744a552-c706-444a-8a15-4a98c41eed50: Claiming fa:16:3e:9f:48:6d 10.100.0.11
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.566 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.568 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.573 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.576 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 NetworkManager[48964]: <info>  [1769163852.5773] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 23 10:24:12 compute-2 NetworkManager[48964]: <info>  [1769163852.5781] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.591 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:48:6d 10.100.0.11'], port_security=['fa:16:3e:9f:48:6d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1107750174', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bae2b00f-87e8-40b7-b7ba-972f7c531998', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1107750174', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '7', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78b908b7-6c71-4e47-8053-0540c37dfe2c, chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=d744a552-c706-444a-8a15-4a98c41eed50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:24:12 compute-2 systemd-machined[194368]: New machine qemu-3-instance-00000009.
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.593 142606 INFO neutron.agent.ovn.metadata.agent [-] Port d744a552-c706-444a-8a15-4a98c41eed50 in datapath 2fb57e44-e877-47c8-860b-b36d5b5ff599 bound to our chassis
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.595 142606 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2fb57e44-e877-47c8-860b-b36d5b5ff599
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.611 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[a2b9e347-614c-44d0-9540-c8cf52bd26fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.612 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2fb57e44-e1 in ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.614 229823 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2fb57e44-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.614 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[db786f37-7ec3-4f96-9a45-3f010b6c99ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.615 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[18f364ba-9697-4de9-8175-c3a263836bb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.632 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb09208-eb6a-4952-bf55-644d18d35648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 systemd[1]: Started Virtual Machine qemu-3-instance-00000009.
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.655 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.659 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[85ada648-006e-4e1d-a64f-582fa61f6965]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.670 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 ovn_controller[132789]: 2026-01-23T10:24:12Z|00048|binding|INFO|Setting lport d744a552-c706-444a-8a15-4a98c41eed50 ovn-installed in OVS
Jan 23 10:24:12 compute-2 ovn_controller[132789]: 2026-01-23T10:24:12Z|00049|binding|INFO|Setting lport d744a552-c706-444a-8a15-4a98c41eed50 up in Southbound
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.675 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 systemd-udevd[233921]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:24:12 compute-2 NetworkManager[48964]: <info>  [1769163852.6907] device (tapd744a552-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:24:12 compute-2 NetworkManager[48964]: <info>  [1769163852.6916] device (tapd744a552-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.703 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[01e8b13d-d55b-49ca-8cb6-5f2f9ad00381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.708 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc47972-a098-43f2-9886-efb7d877928c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 NetworkManager[48964]: <info>  [1769163852.7095] manager: (tap2fb57e44-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.737 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[69b093fb-5a3a-4129-9834-947851c8c9ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.741 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[e9663c79-b5ee-4670-b198-1c0db84fb438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 NetworkManager[48964]: <info>  [1769163852.7659] device (tap2fb57e44-e0): carrier: link connected
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.773 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[86ea1a4c-7154-4ee6-bb06-fe9b4123c12a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.788 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[8c6d3ce8-dcb1-4aac-bf91-a4fd2f1cd7ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fb57e44-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:4a:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500541, 'reachable_time': 34219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233950, 'error': None, 'target': 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.801 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[e216782d-3020-435b-b450-2b4d7707babe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:4a5f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500541, 'tstamp': 500541}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233951, 'error': None, 'target': 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.815 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[9a1a9a51-85e9-4df8-9504-5ab4df753d13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fb57e44-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:4a:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500541, 'reachable_time': 34219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233952, 'error': None, 'target': 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.844 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[fab67375-3e5f-4126-9b0e-7f1a65fce899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:12.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.895 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[17a318de-2193-403e-baef-3b46cd957e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.896 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fb57e44-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.896 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.897 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fb57e44-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.898 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 NetworkManager[48964]: <info>  [1769163852.8992] manager: (tap2fb57e44-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 23 10:24:12 compute-2 kernel: tap2fb57e44-e0: entered promiscuous mode
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.900 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.903 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2fb57e44-e0, col_values=(('external_ids', {'iface-id': '77b74dfc-4c39-4ac5-b1a3-1aa2c0b19a29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.904 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 ovn_controller[132789]: 2026-01-23T10:24:12Z|00050|binding|INFO|Releasing lport 77b74dfc-4c39-4ac5-b1a3-1aa2c0b19a29 from this chassis (sb_readonly=0)
Jan 23 10:24:12 compute-2 nova_compute[225701]: 2026-01-23 10:24:12.925 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.926 142606 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2fb57e44-e877-47c8-860b-b36d5b5ff599.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2fb57e44-e877-47c8-860b-b36d5b5ff599.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.927 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[023e5ff4-2601-4184-bba4-50106786a81b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.928 142606 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: global
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     log         /dev/log local0 debug
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     log-tag     haproxy-metadata-proxy-2fb57e44-e877-47c8-860b-b36d5b5ff599
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     user        root
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     group       root
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     maxconn     1024
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     pidfile     /var/lib/neutron/external/pids/2fb57e44-e877-47c8-860b-b36d5b5ff599.pid.haproxy
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     daemon
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: defaults
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     log global
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     mode http
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     option httplog
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     option dontlognull
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     option http-server-close
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     option forwardfor
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     retries                 3
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     timeout http-request    30s
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     timeout connect         30s
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     timeout client          32s
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     timeout server          32s
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     timeout http-keep-alive 30s
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: listen listener
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     bind 169.254.169.254:80
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:     http-request add-header X-OVN-Network-ID 2fb57e44-e877-47c8-860b-b36d5b5ff599
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:24:12 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.929 142606 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'env', 'PROCESS_TAG=haproxy-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2fb57e44-e877-47c8-860b-b36d5b5ff599.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:24:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:13.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:13 compute-2 ceph-mon[75771]: pgmap v928: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Jan 23 10:24:13 compute-2 podman[233992]: 2026-01-23 10:24:13.333115177 +0000 UTC m=+0.057164604 container create c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 10:24:13 compute-2 systemd[1]: Started libpod-conmon-c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220.scope.
Jan 23 10:24:13 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:24:13 compute-2 podman[233992]: 2026-01-23 10:24:13.300469625 +0000 UTC m=+0.024519072 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:24:13 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71ae759b058fe5e6fdd63c6c93ecfa743881dbbef1d016ad8bb3ef3af3839996/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:24:13 compute-2 podman[233992]: 2026-01-23 10:24:13.418499484 +0000 UTC m=+0.142548931 container init c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:24:13 compute-2 podman[233992]: 2026-01-23 10:24:13.425209097 +0000 UTC m=+0.149258514 container start c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 10:24:13 compute-2 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [NOTICE]   (234045) : New worker (234048) forked
Jan 23 10:24:13 compute-2 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [NOTICE]   (234045) : Loading success.
Jan 23 10:24:13 compute-2 nova_compute[225701]: 2026-01-23 10:24:13.477 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163853.4773085, bae2b00f-87e8-40b7-b7ba-972f7c531998 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:24:13 compute-2 nova_compute[225701]: 2026-01-23 10:24:13.478 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] VM Started (Lifecycle Event)
Jan 23 10:24:13 compute-2 nova_compute[225701]: 2026-01-23 10:24:13.645 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:24:13 compute-2 nova_compute[225701]: 2026-01-23 10:24:13.649 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163853.4806626, bae2b00f-87e8-40b7-b7ba-972f7c531998 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:24:13 compute-2 nova_compute[225701]: 2026-01-23 10:24:13.650 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] VM Paused (Lifecycle Event)
Jan 23 10:24:13 compute-2 nova_compute[225701]: 2026-01-23 10:24:13.683 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:24:13 compute-2 nova_compute[225701]: 2026-01-23 10:24:13.686 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:24:13 compute-2 nova_compute[225701]: 2026-01-23 10:24:13.706 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:24:13 compute-2 nova_compute[225701]: 2026-01-23 10:24:13.750 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.844 225706 DEBUG nova.compute.manager [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.844 225706 DEBUG oslo_concurrency.lockutils [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.845 225706 DEBUG oslo_concurrency.lockutils [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.845 225706 DEBUG oslo_concurrency.lockutils [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.846 225706 DEBUG nova.compute.manager [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Processing event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.847 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 10:24:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:14.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.852 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.852 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163854.8508878, bae2b00f-87e8-40b7-b7ba-972f7c531998 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.853 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] VM Resumed (Lifecycle Event)
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.857 225706 INFO nova.virt.libvirt.driver [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance spawned successfully.
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.857 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.882 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.887 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.887 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.888 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.888 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.888 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.889 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:24:14 compute-2 nova_compute[225701]: 2026-01-23 10:24:14.893 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:24:15 compute-2 nova_compute[225701]: 2026-01-23 10:24:15.013 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:24:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:15 compute-2 nova_compute[225701]: 2026-01-23 10:24:15.104 225706 INFO nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Took 8.67 seconds to spawn the instance on the hypervisor.
Jan 23 10:24:15 compute-2 nova_compute[225701]: 2026-01-23 10:24:15.104 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:24:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:15.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:15 compute-2 nova_compute[225701]: 2026-01-23 10:24:15.412 225706 INFO nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Took 9.97 seconds to build instance.
Jan 23 10:24:15 compute-2 nova_compute[225701]: 2026-01-23 10:24:15.564 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:15 compute-2 ceph-mon[75771]: pgmap v929: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Jan 23 10:24:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102416 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:24:16 compute-2 nova_compute[225701]: 2026-01-23 10:24:16.654 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:16.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:17 compute-2 nova_compute[225701]: 2026-01-23 10:24:17.046 225706 DEBUG nova.compute.manager [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:24:17 compute-2 nova_compute[225701]: 2026-01-23 10:24:17.046 225706 DEBUG oslo_concurrency.lockutils [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:17 compute-2 nova_compute[225701]: 2026-01-23 10:24:17.046 225706 DEBUG oslo_concurrency.lockutils [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:17 compute-2 nova_compute[225701]: 2026-01-23 10:24:17.046 225706 DEBUG oslo_concurrency.lockutils [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:17 compute-2 nova_compute[225701]: 2026-01-23 10:24:17.047 225706 DEBUG nova.compute.manager [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] No waiting events found dispatching network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:24:17 compute-2 nova_compute[225701]: 2026-01-23 10:24:17.047 225706 WARNING nova.compute.manager [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received unexpected event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 for instance with vm_state active and task_state None.
Jan 23 10:24:17 compute-2 ceph-mon[75771]: pgmap v930: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 265 KiB/s rd, 1.8 MiB/s wr, 42 op/s
Jan 23 10:24:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:24:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:17.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:24:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.580 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.580 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.581 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.581 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.581 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.582 225706 INFO nova.compute.manager [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Terminating instance
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.583 225706 DEBUG nova.compute.manager [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 10:24:18 compute-2 kernel: tapd744a552-c7 (unregistering): left promiscuous mode
Jan 23 10:24:18 compute-2 NetworkManager[48964]: <info>  [1769163858.6362] device (tapd744a552-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.644 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:18 compute-2 ovn_controller[132789]: 2026-01-23T10:24:18Z|00051|binding|INFO|Releasing lport d744a552-c706-444a-8a15-4a98c41eed50 from this chassis (sb_readonly=0)
Jan 23 10:24:18 compute-2 ovn_controller[132789]: 2026-01-23T10:24:18Z|00052|binding|INFO|Setting lport d744a552-c706-444a-8a15-4a98c41eed50 down in Southbound
Jan 23 10:24:18 compute-2 ovn_controller[132789]: 2026-01-23T10:24:18Z|00053|binding|INFO|Removing iface tapd744a552-c7 ovn-installed in OVS
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.648 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.662 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:18 compute-2 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 23 10:24:18 compute-2 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Consumed 4.624s CPU time.
Jan 23 10:24:18 compute-2 systemd-machined[194368]: Machine qemu-3-instance-00000009 terminated.
Jan 23 10:24:18 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.707 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:48:6d 10.100.0.11'], port_security=['fa:16:3e:9f:48:6d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1107750174', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bae2b00f-87e8-40b7-b7ba-972f7c531998', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1107750174', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '9', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.207', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78b908b7-6c71-4e47-8053-0540c37dfe2c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=d744a552-c706-444a-8a15-4a98c41eed50) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:24:18 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.709 142606 INFO neutron.agent.ovn.metadata.agent [-] Port d744a552-c706-444a-8a15-4a98c41eed50 in datapath 2fb57e44-e877-47c8-860b-b36d5b5ff599 unbound from our chassis
Jan 23 10:24:18 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.710 142606 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2fb57e44-e877-47c8-860b-b36d5b5ff599, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:24:18 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.711 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[71387733-adef-4668-8ef5-678a917bac0b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:18 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.712 142606 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 namespace which is not needed anymore
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.799 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.820 225706 INFO nova.virt.libvirt.driver [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance destroyed successfully.
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.821 225706 DEBUG nova.objects.instance [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid bae2b00f-87e8-40b7-b7ba-972f7c531998 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:24:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:18 compute-2 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [NOTICE]   (234045) : haproxy version is 2.8.14-c23fe91
Jan 23 10:24:18 compute-2 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [NOTICE]   (234045) : path to executable is /usr/sbin/haproxy
Jan 23 10:24:18 compute-2 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [WARNING]  (234045) : Exiting Master process...
Jan 23 10:24:18 compute-2 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [ALERT]    (234045) : Current worker (234048) exited with code 143 (Terminated)
Jan 23 10:24:18 compute-2 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [WARNING]  (234045) : All workers exited. Exiting... (0)
Jan 23 10:24:18 compute-2 systemd[1]: libpod-c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220.scope: Deactivated successfully.
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.856 225706 DEBUG nova.virt.libvirt.vif [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:24:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1790835147',display_name='tempest-TestNetworkBasicOps-server-1790835147',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1790835147',id=9,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLwh/ci1qy20vB5FyaBepDv6KpYIxs8h6oo7gGlHu7RZtK7kr5mjuHzqdrX+yDa6v1DJrzMXWjaBuQGyTdeFGY8MLFkkRTd0XB8VJHoKHx7kcuI7EyiJu2dhMv2/NI1ZTg==',key_name='tempest-TestNetworkBasicOps-520442326',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:24:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-ox8kizyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:24:15Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=bae2b00f-87e8-40b7-b7ba-972f7c531998,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.857 225706 DEBUG nova.network.os_vif_util [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:24:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.857 225706 DEBUG nova.network.os_vif_util [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.858 225706 DEBUG os_vif [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:24:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:18.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.861 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.861 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd744a552-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:18 compute-2 podman[234091]: 2026-01-23 10:24:18.862210762 +0000 UTC m=+0.048291327 container died c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.863 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.864 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:18 compute-2 nova_compute[225701]: 2026-01-23 10:24:18.868 225706 INFO os_vif [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7')
Jan 23 10:24:18 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220-userdata-shm.mount: Deactivated successfully.
Jan 23 10:24:18 compute-2 systemd[1]: var-lib-containers-storage-overlay-71ae759b058fe5e6fdd63c6c93ecfa743881dbbef1d016ad8bb3ef3af3839996-merged.mount: Deactivated successfully.
Jan 23 10:24:18 compute-2 podman[234091]: 2026-01-23 10:24:18.947082585 +0000 UTC m=+0.133163150 container cleanup c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:24:18 compute-2 systemd[1]: libpod-conmon-c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220.scope: Deactivated successfully.
Jan 23 10:24:19 compute-2 podman[234141]: 2026-01-23 10:24:19.012682066 +0000 UTC m=+0.043535000 container remove c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:24:19 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.019 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[9716fd1d-57c3-4543-9909-eb8fd1a7b564]: (4, ('Fri Jan 23 10:24:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 (c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220)\nc73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220\nFri Jan 23 10:24:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 (c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220)\nc73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:19 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.021 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5be78b-4cde-4965-911a-4c0c5c422c32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:19 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.022 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fb57e44-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:19 compute-2 nova_compute[225701]: 2026-01-23 10:24:19.024 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:19 compute-2 kernel: tap2fb57e44-e0: left promiscuous mode
Jan 23 10:24:19 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.029 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[768fad61-53b1-44b4-8dd6-49ed11bc3804]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:19 compute-2 nova_compute[225701]: 2026-01-23 10:24:19.039 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:19 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.043 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[0ccb772a-4118-4689-98bc-8fa9cb9239e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:19 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.044 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[5615ba58-9798-4f09-b99f-9f661215b127]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:19 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.059 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[41ce2301-6cc8-4c56-95e2-6bc22183b4fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500534, 'reachable_time': 41585, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234159, 'error': None, 'target': 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:19 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.062 142723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:24:19 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.062 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0bf27f-cc4f-4223-bdf6-454a8449ed03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:24:19 compute-2 systemd[1]: run-netns-ovnmeta\x2d2fb57e44\x2de877\x2d47c8\x2d860b\x2db36d5b5ff599.mount: Deactivated successfully.
Jan 23 10:24:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:19.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:19 compute-2 nova_compute[225701]: 2026-01-23 10:24:19.388 225706 DEBUG nova.compute.manager [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-unplugged-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:24:19 compute-2 nova_compute[225701]: 2026-01-23 10:24:19.389 225706 DEBUG oslo_concurrency.lockutils [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:19 compute-2 nova_compute[225701]: 2026-01-23 10:24:19.389 225706 DEBUG oslo_concurrency.lockutils [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:19 compute-2 nova_compute[225701]: 2026-01-23 10:24:19.389 225706 DEBUG oslo_concurrency.lockutils [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:19 compute-2 nova_compute[225701]: 2026-01-23 10:24:19.390 225706 DEBUG nova.compute.manager [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] No waiting events found dispatching network-vif-unplugged-d744a552-c706-444a-8a15-4a98c41eed50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:24:19 compute-2 nova_compute[225701]: 2026-01-23 10:24:19.390 225706 DEBUG nova.compute.manager [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-unplugged-d744a552-c706-444a-8a15-4a98c41eed50 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 10:24:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:20 compute-2 ceph-mon[75771]: pgmap v931: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:24:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:20 compute-2 nova_compute[225701]: 2026-01-23 10:24:20.312 225706 INFO nova.virt.libvirt.driver [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Deleting instance files /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998_del
Jan 23 10:24:20 compute-2 nova_compute[225701]: 2026-01-23 10:24:20.313 225706 INFO nova.virt.libvirt.driver [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Deletion of /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998_del complete
Jan 23 10:24:20 compute-2 nova_compute[225701]: 2026-01-23 10:24:20.367 225706 INFO nova.compute.manager [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Took 1.78 seconds to destroy the instance on the hypervisor.
Jan 23 10:24:20 compute-2 nova_compute[225701]: 2026-01-23 10:24:20.368 225706 DEBUG oslo.service.loopingcall [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 10:24:20 compute-2 nova_compute[225701]: 2026-01-23 10:24:20.368 225706 DEBUG nova.compute.manager [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 10:24:20 compute-2 nova_compute[225701]: 2026-01-23 10:24:20.368 225706 DEBUG nova.network.neutron [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 10:24:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:24:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:20.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:21.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:21 compute-2 nova_compute[225701]: 2026-01-23 10:24:21.498 225706 DEBUG nova.compute.manager [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:24:21 compute-2 nova_compute[225701]: 2026-01-23 10:24:21.498 225706 DEBUG oslo_concurrency.lockutils [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:21 compute-2 nova_compute[225701]: 2026-01-23 10:24:21.498 225706 DEBUG oslo_concurrency.lockutils [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:21 compute-2 nova_compute[225701]: 2026-01-23 10:24:21.499 225706 DEBUG oslo_concurrency.lockutils [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:21 compute-2 nova_compute[225701]: 2026-01-23 10:24:21.499 225706 DEBUG nova.compute.manager [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] No waiting events found dispatching network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:24:21 compute-2 nova_compute[225701]: 2026-01-23 10:24:21.499 225706 WARNING nova.compute.manager [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received unexpected event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 for instance with vm_state active and task_state deleting.
Jan 23 10:24:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:21 compute-2 ceph-mon[75771]: pgmap v932: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 359 KiB/s wr, 86 op/s
Jan 23 10:24:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.295 225706 DEBUG nova.network.neutron [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:24:22 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 14.
Jan 23 10:24:22 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:24:22 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.599s CPU time.
Jan 23 10:24:22 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Start request repeated too quickly.
Jan 23 10:24:22 compute-2 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 10:24:22 compute-2 systemd[1]: Failed to start Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.315 225706 INFO nova.compute.manager [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Took 1.95 seconds to deallocate network for instance.
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.376 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.377 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.427 225706 DEBUG oslo_concurrency.processutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:22.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:24:22 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2724804691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.907 225706 DEBUG oslo_concurrency.processutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.914 225706 DEBUG nova.compute.provider_tree [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.936 225706 DEBUG nova.scheduler.client.report [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.971 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:22 compute-2 nova_compute[225701]: 2026-01-23 10:24:22.994 225706 INFO nova.scheduler.client.report [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance bae2b00f-87e8-40b7-b7ba-972f7c531998
Jan 23 10:24:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:23 compute-2 nova_compute[225701]: 2026-01-23 10:24:23.105 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:23.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:23 compute-2 podman[234188]: 2026-01-23 10:24:23.626042709 +0000 UTC m=+0.051406723 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 10:24:23 compute-2 podman[234187]: 2026-01-23 10:24:23.650965901 +0000 UTC m=+0.078419566 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 23 10:24:23 compute-2 nova_compute[225701]: 2026-01-23 10:24:23.810 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:23 compute-2 nova_compute[225701]: 2026-01-23 10:24:23.863 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:24.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:25.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:25 compute-2 sudo[234231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:24:25 compute-2 sudo[234231]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:25 compute-2 sudo[234231]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:25 compute-2 ceph-mon[75771]: pgmap v933: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 360 KiB/s wr, 113 op/s
Jan 23 10:24:25 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2724804691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:25 compute-2 ceph-mon[75771]: pgmap v934: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.3 KiB/s wr, 98 op/s
Jan 23 10:24:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:26.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:27.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:27 compute-2 ceph-mon[75771]: pgmap v935: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.3 KiB/s wr, 98 op/s
Jan 23 10:24:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102428 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:24:28 compute-2 nova_compute[225701]: 2026-01-23 10:24:28.813 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:28 compute-2 nova_compute[225701]: 2026-01-23 10:24:28.865 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:28.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:29.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:29 compute-2 ceph-mon[75771]: pgmap v936: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.2 KiB/s wr, 84 op/s
Jan 23 10:24:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:30 compute-2 nova_compute[225701]: 2026-01-23 10:24:30.567 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:30 compute-2 nova_compute[225701]: 2026-01-23 10:24:30.641 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:24:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:30.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:24:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:30.923 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:24:30 compute-2 nova_compute[225701]: 2026-01-23 10:24:30.924 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:30.924 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:24:30 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:30.925 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:31.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:32 compute-2 ceph-mon[75771]: pgmap v937: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Jan 23 10:24:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:32.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:33.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:33 compute-2 nova_compute[225701]: 2026-01-23 10:24:33.816 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:33 compute-2 nova_compute[225701]: 2026-01-23 10:24:33.818 225706 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163858.8164032, bae2b00f-87e8-40b7-b7ba-972f7c531998 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:24:33 compute-2 nova_compute[225701]: 2026-01-23 10:24:33.818 225706 INFO nova.compute.manager [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] VM Stopped (Lifecycle Event)
Jan 23 10:24:33 compute-2 nova_compute[225701]: 2026-01-23 10:24:33.836 225706 DEBUG nova.compute.manager [None req-7596056e-82ba-4d7d-9765-c4ac4d5de086 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:24:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:33 compute-2 ceph-mon[75771]: pgmap v938: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Jan 23 10:24:33 compute-2 nova_compute[225701]: 2026-01-23 10:24:33.867 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:34.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:35.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:36.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:37.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:37 compute-2 ceph-mon[75771]: pgmap v939: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:24:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:24:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:38 compute-2 nova_compute[225701]: 2026-01-23 10:24:38.868 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:24:38 compute-2 nova_compute[225701]: 2026-01-23 10:24:38.869 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:24:38 compute-2 nova_compute[225701]: 2026-01-23 10:24:38.870 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 23 10:24:38 compute-2 nova_compute[225701]: 2026-01-23 10:24:38.870 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:24:38 compute-2 nova_compute[225701]: 2026-01-23 10:24:38.871 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:38 compute-2 nova_compute[225701]: 2026-01-23 10:24:38.872 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:24:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:38.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:39.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:39 compute-2 ceph-mon[75771]: pgmap v940: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:24:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:40.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:40 compute-2 ceph-mon[75771]: pgmap v941: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:24:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:41.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:42 compute-2 ceph-mon[75771]: pgmap v942: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:24:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:42.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:43.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:43 compute-2 ceph-mon[75771]: pgmap v943: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 853 B/s rd, 426 B/s wr, 1 op/s
Jan 23 10:24:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:43 compute-2 nova_compute[225701]: 2026-01-23 10:24:43.872 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:43 compute-2 nova_compute[225701]: 2026-01-23 10:24:43.874 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:44.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:45.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:45 compute-2 sudo[234278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:24:45 compute-2 sudo[234278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:45 compute-2 sudo[234278]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:45 compute-2 ceph-mon[75771]: pgmap v944: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Jan 23 10:24:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:24:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:46.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:24:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:47.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:48 compute-2 ceph-mon[75771]: pgmap v945: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:24:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:48 compute-2 nova_compute[225701]: 2026-01-23 10:24:48.873 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:48 compute-2 nova_compute[225701]: 2026-01-23 10:24:48.875 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:48.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:49.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:49 compute-2 ceph-mon[75771]: pgmap v946: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Jan 23 10:24:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3038382262' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:24:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3038382262' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:24:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:50.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:24:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:51.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:52 compute-2 ceph-mon[75771]: pgmap v947: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Jan 23 10:24:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:52.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:53 compute-2 ceph-mon[75771]: pgmap v948: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 341 B/s wr, 1 op/s
Jan 23 10:24:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:53.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:53 compute-2 nova_compute[225701]: 2026-01-23 10:24:53.876 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:53 compute-2 nova_compute[225701]: 2026-01-23 10:24:53.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:24:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:54 compute-2 podman[234314]: 2026-01-23 10:24:54.63048386 +0000 UTC m=+0.044041143 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 10:24:54 compute-2 podman[234313]: 2026-01-23 10:24:54.651400654 +0000 UTC m=+0.074066020 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 10:24:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:54.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:55.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:55.492 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:55.493 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:24:55.493 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:55 compute-2 sudo[234358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:24:55 compute-2 sudo[234358]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:55 compute-2 sudo[234358]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:55 compute-2 sudo[234383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 10:24:55 compute-2 sudo[234383]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:56 compute-2 ceph-mon[75771]: pgmap v949: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:24:56 compute-2 podman[234482]: 2026-01-23 10:24:56.287892271 +0000 UTC m=+0.196817133 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 10:24:56 compute-2 podman[234482]: 2026-01-23 10:24:56.421091631 +0000 UTC m=+0.330016473 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:24:56 compute-2 podman[234586]: 2026-01-23 10:24:56.789415804 +0000 UTC m=+0.060335222 container exec 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:24:56 compute-2 podman[234586]: 2026-01-23 10:24:56.808175164 +0000 UTC m=+0.079094582 container exec_died 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:24:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:56.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1853192220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:57 compute-2 ceph-mon[75771]: pgmap v950: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 23 10:24:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 10:24:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:57 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 10:24:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:57.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:57 compute-2 podman[234742]: 2026-01-23 10:24:57.442432236 +0000 UTC m=+0.052678794 container exec c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 10:24:57 compute-2 podman[234742]: 2026-01-23 10:24:57.48210197 +0000 UTC m=+0.092348538 container exec_died c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 10:24:57 compute-2 podman[234809]: 2026-01-23 10:24:57.675748844 +0000 UTC m=+0.047900627 container exec 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.buildah.version=1.28.2, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.openshift.tags=Ceph keepalived)
Jan 23 10:24:57 compute-2 podman[234809]: 2026-01-23 10:24:57.689972843 +0000 UTC m=+0.062124616 container exec_died 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, vcs-type=git, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, distribution-scope=public)
Jan 23 10:24:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:57 compute-2 sudo[234383]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:57 compute-2 sudo[234879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:24:57 compute-2 sudo[234879]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:57 compute-2 sudo[234879]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:58 compute-2 sudo[234904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:24:58 compute-2 sudo[234904]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:58 compute-2 sudo[234904]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:58 compute-2 sudo[234962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:24:58 compute-2 sudo[234962]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:58 compute-2 sudo[234962]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:58 compute-2 sudo[234987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 10:24:58 compute-2 sudo[234987]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:58 compute-2 nova_compute[225701]: 2026-01-23 10:24:58.877 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:24:58 compute-2 nova_compute[225701]: 2026-01-23 10:24:58.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:58 compute-2 nova_compute[225701]: 2026-01-23 10:24:58.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 23 10:24:58 compute-2 nova_compute[225701]: 2026-01-23 10:24:58.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:24:58 compute-2 nova_compute[225701]: 2026-01-23 10:24:58.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:24:58 compute-2 nova_compute[225701]: 2026-01-23 10:24:58.879 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:58.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:58 compute-2 sudo[234987]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:59 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:59 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:59 compute-2 sudo[235032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:24:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:59 compute-2 sudo[235032]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:59 compute-2 sudo[235032]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:59 compute-2 sudo[235057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- inventory --format=json-pretty --filter-for-batch
Jan 23 10:24:59 compute-2 sudo[235057]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:24:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:24:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:59.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:24:59 compute-2 podman[235121]: 2026-01-23 10:24:59.55207572 +0000 UTC m=+0.052044759 container create 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Jan 23 10:24:59 compute-2 systemd[1]: Started libpod-conmon-462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b.scope.
Jan 23 10:24:59 compute-2 podman[235121]: 2026-01-23 10:24:59.525587249 +0000 UTC m=+0.025556378 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:24:59 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:24:59 compute-2 podman[235121]: 2026-01-23 10:24:59.643629947 +0000 UTC m=+0.143599016 container init 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Jan 23 10:24:59 compute-2 podman[235121]: 2026-01-23 10:24:59.655001066 +0000 UTC m=+0.154970105 container start 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:24:59 compute-2 podman[235121]: 2026-01-23 10:24:59.65883328 +0000 UTC m=+0.158802319 container attach 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:24:59 compute-2 awesome_jennings[235137]: 167 167
Jan 23 10:24:59 compute-2 systemd[1]: libpod-462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b.scope: Deactivated successfully.
Jan 23 10:24:59 compute-2 podman[235142]: 2026-01-23 10:24:59.710347015 +0000 UTC m=+0.029140326 container died 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:24:59 compute-2 systemd[1]: var-lib-containers-storage-overlay-98f0838e9607886778be7dd3570121eb5eb3d92285ec27d408631f743dc5edab-merged.mount: Deactivated successfully.
Jan 23 10:24:59 compute-2 podman[235142]: 2026-01-23 10:24:59.749845275 +0000 UTC m=+0.068638566 container remove 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 10:24:59 compute-2 systemd[1]: libpod-conmon-462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b.scope: Deactivated successfully.
Jan 23 10:24:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:24:59 compute-2 podman[235164]: 2026-01-23 10:24:59.980987589 +0000 UTC m=+0.050397898 container create 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 23 10:25:00 compute-2 systemd[1]: Started libpod-conmon-6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951.scope.
Jan 23 10:25:00 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:25:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 10:25:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 10:25:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:25:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 10:25:00 compute-2 podman[235164]: 2026-01-23 10:24:59.96105417 +0000 UTC m=+0.030464509 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:25:00 compute-2 podman[235164]: 2026-01-23 10:25:00.056111054 +0000 UTC m=+0.125521363 container init 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 23 10:25:00 compute-2 podman[235164]: 2026-01-23 10:25:00.06447986 +0000 UTC m=+0.133890149 container start 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Jan 23 10:25:00 compute-2 podman[235164]: 2026-01-23 10:25:00.067707809 +0000 UTC m=+0.137118108 container attach 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Jan 23 10:25:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:00 compute-2 ceph-mon[75771]: pgmap v951: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 23 10:25:00 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:00 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:00 compute-2 condescending_chaum[235180]: [
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:     {
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         "available": false,
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         "being_replaced": false,
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         "ceph_device_lvm": false,
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         "lsm_data": {},
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         "lvs": [],
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         "path": "/dev/sr0",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         "rejected_reasons": [
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "Insufficient space (<5GB)",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "Has a FileSystem"
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         ],
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         "sys_api": {
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "actuators": null,
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "device_nodes": [
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:                 "sr0"
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             ],
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "devname": "sr0",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "human_readable_size": "482.00 KB",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "id_bus": "ata",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "model": "QEMU DVD-ROM",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "nr_requests": "2",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "parent": "/dev/sr0",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "partitions": {},
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "path": "/dev/sr0",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "removable": "1",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "rev": "2.5+",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "ro": "0",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "rotational": "1",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "sas_address": "",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "sas_device_handle": "",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "scheduler_mode": "mq-deadline",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "sectors": 0,
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "sectorsize": "2048",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "size": 493568.0,
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "support_discard": "2048",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "type": "disk",
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:             "vendor": "QEMU"
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:         }
Jan 23 10:25:00 compute-2 condescending_chaum[235180]:     }
Jan 23 10:25:00 compute-2 condescending_chaum[235180]: ]
Jan 23 10:25:00 compute-2 systemd[1]: libpod-6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951.scope: Deactivated successfully.
Jan 23 10:25:00 compute-2 podman[235164]: 2026-01-23 10:25:00.833466619 +0000 UTC m=+0.902876928 container died 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:25:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:00 compute-2 systemd[1]: var-lib-containers-storage-overlay-a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d-merged.mount: Deactivated successfully.
Jan 23 10:25:00 compute-2 podman[235164]: 2026-01-23 10:25:00.880011711 +0000 UTC m=+0.949422000 container remove 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 10:25:00 compute-2 systemd[1]: libpod-conmon-6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951.scope: Deactivated successfully.
Jan 23 10:25:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:00.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:00 compute-2 sudo[235057]: pam_unix(sudo:session): session closed for user root
Jan 23 10:25:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:01.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:01 compute-2 ceph-mon[75771]: pgmap v952: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/207192348' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:25:01 compute-2 ceph-mon[75771]: pgmap v953: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:25:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:25:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:02 compute-2 ceph-mon[75771]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 10:25:02 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2995660539' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:02 compute-2 nova_compute[225701]: 2026-01-23 10:25:02.780 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:02.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:03.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:03 compute-2 nova_compute[225701]: 2026-01-23 10:25:03.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:03 compute-2 nova_compute[225701]: 2026-01-23 10:25:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:25:03 compute-2 nova_compute[225701]: 2026-01-23 10:25:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:25:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:03 compute-2 nova_compute[225701]: 2026-01-23 10:25:03.880 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:25:03 compute-2 nova_compute[225701]: 2026-01-23 10:25:03.881 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:03 compute-2 nova_compute[225701]: 2026-01-23 10:25:03.881 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 23 10:25:03 compute-2 nova_compute[225701]: 2026-01-23 10:25:03.882 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:25:03 compute-2 nova_compute[225701]: 2026-01-23 10:25:03.882 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:25:03 compute-2 nova_compute[225701]: 2026-01-23 10:25:03.883 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:04 compute-2 nova_compute[225701]: 2026-01-23 10:25:04.218 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:25:04 compute-2 ceph-mon[75771]: pgmap v954: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Jan 23 10:25:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:04.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:05.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:05 compute-2 sudo[236312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:25:05 compute-2 sudo[236312]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:25:05 compute-2 sudo[236312]: pam_unix(sudo:session): session closed for user root
Jan 23 10:25:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:06 compute-2 nova_compute[225701]: 2026-01-23 10:25:06.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:25:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:06.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:25:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:07.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:08 compute-2 ceph-mon[75771]: pgmap v955: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Jan 23 10:25:08 compute-2 nova_compute[225701]: 2026-01-23 10:25:08.780 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:08 compute-2 nova_compute[225701]: 2026-01-23 10:25:08.928 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:25:08 compute-2 nova_compute[225701]: 2026-01-23 10:25:08.929 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:08 compute-2 nova_compute[225701]: 2026-01-23 10:25:08.929 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 23 10:25:08 compute-2 nova_compute[225701]: 2026-01-23 10:25:08.929 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:25:08 compute-2 nova_compute[225701]: 2026-01-23 10:25:08.930 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:25:08 compute-2 nova_compute[225701]: 2026-01-23 10:25:08.931 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:08.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:09.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:09 compute-2 ceph-mon[75771]: pgmap v956: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 2.1 MiB/s wr, 43 op/s
Jan 23 10:25:09 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:09 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:25:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3677633277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:09 compute-2 nova_compute[225701]: 2026-01-23 10:25:09.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:09 compute-2 nova_compute[225701]: 2026-01-23 10:25:09.946 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:09 compute-2 nova_compute[225701]: 2026-01-23 10:25:09.947 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:09 compute-2 nova_compute[225701]: 2026-01-23 10:25:09.947 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:09 compute-2 nova_compute[225701]: 2026-01-23 10:25:09.947 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:25:09 compute-2 nova_compute[225701]: 2026-01-23 10:25:09.948 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:25:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1091720736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:10 compute-2 nova_compute[225701]: 2026-01-23 10:25:10.452 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:10 compute-2 sudo[236366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:25:10 compute-2 sudo[236366]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:25:10 compute-2 sudo[236366]: pam_unix(sudo:session): session closed for user root
Jan 23 10:25:10 compute-2 nova_compute[225701]: 2026-01-23 10:25:10.634 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:25:10 compute-2 nova_compute[225701]: 2026-01-23 10:25:10.635 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4880MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:25:10 compute-2 nova_compute[225701]: 2026-01-23 10:25:10.635 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:10 compute-2 nova_compute[225701]: 2026-01-23 10:25:10.636 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:10 compute-2 nova_compute[225701]: 2026-01-23 10:25:10.774 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:25:10 compute-2 nova_compute[225701]: 2026-01-23 10:25:10.774 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:25:10 compute-2 nova_compute[225701]: 2026-01-23 10:25:10.792 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:10 compute-2 ceph-mon[75771]: pgmap v957: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 2.1 MiB/s wr, 43 op/s
Jan 23 10:25:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1377665100' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3681678963' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3204813948' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:10 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1091720736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:10.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:25:11 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1971763186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:11.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:11 compute-2 nova_compute[225701]: 2026-01-23 10:25:11.282 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:11 compute-2 nova_compute[225701]: 2026-01-23 10:25:11.288 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:25:11 compute-2 nova_compute[225701]: 2026-01-23 10:25:11.301 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:25:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:11 compute-2 nova_compute[225701]: 2026-01-23 10:25:11.322 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:25:11 compute-2 nova_compute[225701]: 2026-01-23 10:25:11.322 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:12 compute-2 nova_compute[225701]: 2026-01-23 10:25:12.323 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:12 compute-2 nova_compute[225701]: 2026-01-23 10:25:12.323 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:12 compute-2 nova_compute[225701]: 2026-01-23 10:25:12.324 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:12 compute-2 nova_compute[225701]: 2026-01-23 10:25:12.324 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:12 compute-2 nova_compute[225701]: 2026-01-23 10:25:12.324 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:12 compute-2 nova_compute[225701]: 2026-01-23 10:25:12.324 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:25:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1971763186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:12.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.017547) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913017765, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1267, "num_deletes": 251, "total_data_size": 3164328, "memory_usage": 3227232, "flush_reason": "Manual Compaction"}
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913030854, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2012832, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29978, "largest_seqno": 31239, "table_properties": {"data_size": 2007176, "index_size": 2987, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12560, "raw_average_key_size": 20, "raw_value_size": 1995757, "raw_average_value_size": 3234, "num_data_blocks": 128, "num_entries": 617, "num_filter_entries": 617, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163822, "oldest_key_time": 1769163822, "file_creation_time": 1769163913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 13331 microseconds, and 5871 cpu microseconds.
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.030930) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2012832 bytes OK
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.030958) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032879) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032898) EVENT_LOG_v1 {"time_micros": 1769163913032894, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032919) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3158232, prev total WAL file size 3158232, number of live WAL files 2.
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.033992) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1965KB)], [57(12MB)]
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913034349, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 14774089, "oldest_snapshot_seqno": -1}
Jan 23 10:25:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5879 keys, 12616502 bytes, temperature: kUnknown
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913164851, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12616502, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12578658, "index_size": 22054, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14725, "raw_key_size": 152049, "raw_average_key_size": 25, "raw_value_size": 12473824, "raw_average_value_size": 2121, "num_data_blocks": 881, "num_entries": 5879, "num_filter_entries": 5879, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.165139) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12616502 bytes
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.167531) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 113.1 rd, 96.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.2 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(13.6) write-amplify(6.3) OK, records in: 6400, records dropped: 521 output_compression: NoCompression
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.167547) EVENT_LOG_v1 {"time_micros": 1769163913167540, "job": 34, "event": "compaction_finished", "compaction_time_micros": 130591, "compaction_time_cpu_micros": 55216, "output_level": 6, "num_output_files": 1, "total_output_size": 12616502, "num_input_records": 6400, "num_output_records": 5879, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913168099, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913170252, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.033742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170310) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170318) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:13.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:13 compute-2 ceph-mon[75771]: pgmap v958: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 119 op/s
Jan 23 10:25:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:13 compute-2 ovn_controller[132789]: 2026-01-23T10:25:13Z|00054|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 10:25:13 compute-2 nova_compute[225701]: 2026-01-23 10:25:13.931 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:14 compute-2 ceph-mon[75771]: pgmap v959: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:25:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:14.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:15.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:16 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:16 compute-2 ceph-mon[75771]: pgmap v960: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:25:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:16.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:17.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:18 compute-2 ceph-mon[75771]: pgmap v961: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:25:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:18.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:18 compute-2 nova_compute[225701]: 2026-01-23 10:25:18.975 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:25:18 compute-2 nova_compute[225701]: 2026-01-23 10:25:18.976 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:18 compute-2 nova_compute[225701]: 2026-01-23 10:25:18.976 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 23 10:25:18 compute-2 nova_compute[225701]: 2026-01-23 10:25:18.976 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:25:18 compute-2 nova_compute[225701]: 2026-01-23 10:25:18.977 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:25:18 compute-2 nova_compute[225701]: 2026-01-23 10:25:18.977 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:19.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:20 compute-2 ceph-mon[75771]: pgmap v962: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 23 10:25:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:20.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:25:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:21.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:25:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:22 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:25:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:22.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:23.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:23 compute-2 nova_compute[225701]: 2026-01-23 10:25:23.979 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:24 compute-2 ceph-mon[75771]: pgmap v963: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 123 op/s
Jan 23 10:25:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:24.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:25 compute-2 ceph-mon[75771]: pgmap v964: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 297 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 23 10:25:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:25.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:25 compute-2 sudo[236427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:25:25 compute-2 sudo[236427]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:25:25 compute-2 sudo[236427]: pam_unix(sudo:session): session closed for user root
Jan 23 10:25:25 compute-2 podman[236451]: 2026-01-23 10:25:25.666941337 +0000 UTC m=+0.080109449 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 10:25:25 compute-2 podman[236448]: 2026-01-23 10:25:25.690793162 +0000 UTC m=+0.101726368 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 10:25:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:26 compute-2 ceph-mon[75771]: pgmap v965: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 297 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 23 10:25:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:26.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:27.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:28 compute-2 ceph-mon[75771]: pgmap v966: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 331 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:25:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:28.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:28 compute-2 nova_compute[225701]: 2026-01-23 10:25:28.980 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:29.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:30 compute-2 ceph-mon[75771]: pgmap v967: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 330 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Jan 23 10:25:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:30.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:31.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:32 compute-2 nova_compute[225701]: 2026-01-23 10:25:32.040 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:32.041 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:25:32 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:32.044 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:25:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:32 compute-2 ceph-mon[75771]: pgmap v968: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 331 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:25:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:32.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:33.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:33 compute-2 nova_compute[225701]: 2026-01-23 10:25:33.981 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:34 compute-2 ceph-mon[75771]: pgmap v969: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 14 KiB/s wr, 6 op/s
Jan 23 10:25:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:34.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:35.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:25:36 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:36.046 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:36 compute-2 ceph-mon[75771]: pgmap v970: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 14 KiB/s wr, 6 op/s
Jan 23 10:25:36 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/976421879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:36.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:37.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:38 compute-2 ceph-mon[75771]: pgmap v971: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 15 KiB/s wr, 34 op/s
Jan 23 10:25:38 compute-2 nova_compute[225701]: 2026-01-23 10:25:38.983 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:38.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:39.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:40 compute-2 ceph-mon[75771]: pgmap v972: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 15 KiB/s wr, 28 op/s
Jan 23 10:25:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:40.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:41.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:42.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:43 compute-2 ceph-mon[75771]: pgmap v973: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 15 KiB/s wr, 29 op/s
Jan 23 10:25:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:43.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:43 compute-2 nova_compute[225701]: 2026-01-23 10:25:43.984 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:43 compute-2 nova_compute[225701]: 2026-01-23 10:25:43.986 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:44 compute-2 ceph-mon[75771]: pgmap v974: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:25:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:44.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:45.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:45 compute-2 sudo[236516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:25:45 compute-2 sudo[236516]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:25:45 compute-2 sudo[236516]: pam_unix(sudo:session): session closed for user root
Jan 23 10:25:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:46 compute-2 ceph-mon[75771]: pgmap v975: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:25:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:46.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:47.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102547 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:25:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [ALERT] 022/102547 (4) : backend 'backend' has no server available!
Jan 23 10:25:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:48 compute-2 ceph-mon[75771]: pgmap v976: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:25:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:48 compute-2 nova_compute[225701]: 2026-01-23 10:25:48.986 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:49.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:49.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/384635064' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:25:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/384635064' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:25:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:50 compute-2 ceph-mon[75771]: pgmap v977: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:25:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:25:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:51.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:51.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:51 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:52 compute-2 ceph-mon[75771]: pgmap v978: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:25:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:53.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:53.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:53 compute-2 nova_compute[225701]: 2026-01-23 10:25:53.521 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:53 compute-2 nova_compute[225701]: 2026-01-23 10:25:53.521 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:53 compute-2 nova_compute[225701]: 2026-01-23 10:25:53.543 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 10:25:53 compute-2 nova_compute[225701]: 2026-01-23 10:25:53.612 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:53 compute-2 nova_compute[225701]: 2026-01-23 10:25:53.613 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:53 compute-2 nova_compute[225701]: 2026-01-23 10:25:53.619 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 10:25:53 compute-2 nova_compute[225701]: 2026-01-23 10:25:53.620 225706 INFO nova.compute.claims [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Claim successful on node compute-2.ctlplane.example.com
Jan 23 10:25:53 compute-2 nova_compute[225701]: 2026-01-23 10:25:53.717 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:53 compute-2 nova_compute[225701]: 2026-01-23 10:25:53.989 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:25:54 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4284066152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.220 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.225 225706 DEBUG nova.compute.provider_tree [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.239 225706 DEBUG nova.scheduler.client.report [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.266 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.267 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.322 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.322 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.348 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.365 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.455 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.457 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.457 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Creating image(s)
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.491 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.521 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.546 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.549 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:54 compute-2 ceph-mon[75771]: pgmap v979: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:25:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4284066152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.606 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.608 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.608 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.609 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.637 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.641 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.804 225706 DEBUG nova.policy [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:25:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:54 compute-2 nova_compute[225701]: 2026-01-23 10:25:54.944 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.303s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:55 compute-2 nova_compute[225701]: 2026-01-23 10:25:55.009 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 10:25:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:55.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:55 compute-2 nova_compute[225701]: 2026-01-23 10:25:55.119 225706 DEBUG nova.objects.instance [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:25:55 compute-2 nova_compute[225701]: 2026-01-23 10:25:55.138 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 10:25:55 compute-2 nova_compute[225701]: 2026-01-23 10:25:55.139 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Ensure instance console log exists: /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 10:25:55 compute-2 nova_compute[225701]: 2026-01-23 10:25:55.139 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:55 compute-2 nova_compute[225701]: 2026-01-23 10:25:55.139 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:55 compute-2 nova_compute[225701]: 2026-01-23 10:25:55.140 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:55.493 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:55.494 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:55.494 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:55 compute-2 nova_compute[225701]: 2026-01-23 10:25:55.936 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Successfully created port: 2611e513-4316-4421-8b89-1c0f37157967 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:25:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:56 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:56 compute-2 ceph-mon[75771]: pgmap v980: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:25:56 compute-2 podman[236742]: 2026-01-23 10:25:56.636810727 +0000 UTC m=+0.055503634 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 23 10:25:56 compute-2 podman[236741]: 2026-01-23 10:25:56.662853216 +0000 UTC m=+0.083897571 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 10:25:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:56 compute-2 nova_compute[225701]: 2026-01-23 10:25:56.915 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Successfully updated port: 2611e513-4316-4421-8b89-1c0f37157967 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:25:56 compute-2 nova_compute[225701]: 2026-01-23 10:25:56.930 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:25:56 compute-2 nova_compute[225701]: 2026-01-23 10:25:56.930 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:25:56 compute-2 nova_compute[225701]: 2026-01-23 10:25:56.930 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:25:56 compute-2 nova_compute[225701]: 2026-01-23 10:25:56.990 225706 DEBUG nova.compute.manager [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:25:56 compute-2 nova_compute[225701]: 2026-01-23 10:25:56.990 225706 DEBUG nova.compute.manager [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:25:56 compute-2 nova_compute[225701]: 2026-01-23 10:25:56.991 225706 DEBUG oslo_concurrency.lockutils [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:25:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:57.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:57 compute-2 nova_compute[225701]: 2026-01-23 10:25:57.042 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 10:25:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:57.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:57 compute-2 nova_compute[225701]: 2026-01-23 10:25:57.982 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.000 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.001 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance network_info: |[{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.001 225706 DEBUG oslo_concurrency.lockutils [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.001 225706 DEBUG nova.network.neutron [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.003 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start _get_guest_xml network_info=[{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.007 225706 WARNING nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.015 225706 DEBUG nova.virt.libvirt.host [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.016 225706 DEBUG nova.virt.libvirt.host [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.021 225706 DEBUG nova.virt.libvirt.host [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.022 225706 DEBUG nova.virt.libvirt.host [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.022 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.022 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.023 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.023 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.023 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.024 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.024 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.024 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.024 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.025 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.025 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.025 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.028 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:25:58 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1766427025' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.498 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.524 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.528 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:58 compute-2 ceph-mon[75771]: pgmap v981: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:25:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1766427025' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:25:58 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2805498511' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:58 compute-2 nova_compute[225701]: 2026-01-23 10:25:58.990 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:59.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.020 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.021 225706 DEBUG nova.virt.libvirt.vif [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1129329512',display_name='tempest-TestNetworkBasicOps-server-1129329512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1129329512',id=11,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOLllCuGpYDHB8HQl4gVCADogEY6z7uz5xBJbTjU7iL3TTWWE5uwU0nWT40qz7D0IhyDFXlwX4fWDCogYSyOPhCdGvOGsxFut3XTWNKcRsbqCULLjO4VMFh09pWX8E0IA==',key_name='tempest-TestNetworkBasicOps-1378329290',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-6r33a8b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:25:54Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.021 225706 DEBUG nova.network.os_vif_util [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.022 225706 DEBUG nova.network.os_vif_util [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.023 225706 DEBUG nova.objects.instance [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.042 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] End _get_guest_xml xml=<domain type="kvm">
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <uuid>65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad</uuid>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <name>instance-0000000b</name>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <memory>131072</memory>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <vcpu>1</vcpu>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <metadata>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <nova:name>tempest-TestNetworkBasicOps-server-1129329512</nova:name>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <nova:creationTime>2026-01-23 10:25:58</nova:creationTime>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <nova:flavor name="m1.nano">
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <nova:memory>128</nova:memory>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <nova:disk>1</nova:disk>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <nova:swap>0</nova:swap>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <nova:vcpus>1</nova:vcpus>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       </nova:flavor>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <nova:owner>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       </nova:owner>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <nova:ports>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <nova:port uuid="2611e513-4316-4421-8b89-1c0f37157967">
Jan 23 10:25:59 compute-2 nova_compute[225701]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         </nova:port>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       </nova:ports>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     </nova:instance>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   </metadata>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <sysinfo type="smbios">
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <system>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <entry name="manufacturer">RDO</entry>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <entry name="product">OpenStack Compute</entry>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <entry name="serial">65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad</entry>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <entry name="uuid">65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad</entry>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <entry name="family">Virtual Machine</entry>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     </system>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   </sysinfo>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <os>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <boot dev="hd"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <smbios mode="sysinfo"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   </os>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <features>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <acpi/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <apic/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <vmcoreinfo/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   </features>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <clock offset="utc">
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <timer name="hpet" present="no"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   </clock>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <cpu mode="host-model" match="exact">
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   </cpu>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   <devices>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <disk type="network" device="disk">
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <driver type="raw" cache="none"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <source protocol="rbd" name="vms/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk">
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       </source>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <auth username="openstack">
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       </auth>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <target dev="vda" bus="virtio"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <disk type="network" device="cdrom">
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <driver type="raw" cache="none"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <source protocol="rbd" name="vms/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config">
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       </source>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <auth username="openstack">
Jan 23 10:25:59 compute-2 nova_compute[225701]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       </auth>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <target dev="sda" bus="sata"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     </disk>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <interface type="ethernet">
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <mac address="fa:16:3e:58:a8:f2"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <model type="virtio"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <mtu size="1442"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <target dev="tap2611e513-43"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     </interface>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <serial type="pty">
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <log file="/var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/console.log" append="off"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     </serial>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <video>
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <model type="virtio"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     </video>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <input type="tablet" bus="usb"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <rng model="virtio">
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <backend model="random">/dev/urandom</backend>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     </rng>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <controller type="usb" index="0"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     <memballoon model="virtio">
Jan 23 10:25:59 compute-2 nova_compute[225701]:       <stats period="10"/>
Jan 23 10:25:59 compute-2 nova_compute[225701]:     </memballoon>
Jan 23 10:25:59 compute-2 nova_compute[225701]:   </devices>
Jan 23 10:25:59 compute-2 nova_compute[225701]: </domain>
Jan 23 10:25:59 compute-2 nova_compute[225701]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.042 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Preparing to wait for external event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.043 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.043 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.043 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.044 225706 DEBUG nova.virt.libvirt.vif [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1129329512',display_name='tempest-TestNetworkBasicOps-server-1129329512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1129329512',id=11,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOLllCuGpYDHB8HQl4gVCADogEY6z7uz5xBJbTjU7iL3TTWWE5uwU0nWT40qz7D0IhyDFXlwX4fWDCogYSyOPhCdGvOGsxFut3XTWNKcRsbqCULLjO4VMFh09pWX8E0IA==',key_name='tempest-TestNetworkBasicOps-1378329290',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-6r33a8b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:25:54Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.045 225706 DEBUG nova.network.os_vif_util [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.046 225706 DEBUG nova.network.os_vif_util [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.046 225706 DEBUG os_vif [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.047 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.047 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.048 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.053 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.054 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2611e513-43, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.055 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2611e513-43, col_values=(('external_ids', {'iface-id': '2611e513-4316-4421-8b89-1c0f37157967', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:a8:f2', 'vm-uuid': '65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.056 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 NetworkManager[48964]: <info>  [1769163959.0570] manager: (tap2611e513-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.059 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.062 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.063 225706 INFO os_vif [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43')
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.116 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:25:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.117 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.117 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:58:a8:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.118 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Using config drive
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.139 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:25:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:25:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:59.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.449 225706 DEBUG nova.network.neutron [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.450 225706 DEBUG nova.network.neutron [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.473 225706 DEBUG oslo_concurrency.lockutils [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.492 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Creating config drive at /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.496 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczvl3v5l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.619 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczvl3v5l" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.646 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.650 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.823 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.824 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Deleting local config drive /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config because it was imported into RBD.
Jan 23 10:25:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:25:59 compute-2 kernel: tap2611e513-43: entered promiscuous mode
Jan 23 10:25:59 compute-2 NetworkManager[48964]: <info>  [1769163959.8805] manager: (tap2611e513-43): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 23 10:25:59 compute-2 ovn_controller[132789]: 2026-01-23T10:25:59Z|00055|binding|INFO|Claiming lport 2611e513-4316-4421-8b89-1c0f37157967 for this chassis.
Jan 23 10:25:59 compute-2 ovn_controller[132789]: 2026-01-23T10:25:59Z|00056|binding|INFO|2611e513-4316-4421-8b89-1c0f37157967: Claiming fa:16:3e:58:a8:f2 10.100.0.13
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.880 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.884 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.894 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.905 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:a8:f2 10.100.0.13'], port_security=['fa:16:3e:58:a8:f2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-712c0ef6-fbbe-4577-b44d-9610116b414a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1d01fb50-5068-4dfb-b608-e6e67ad89b2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3547f5ca-ca7c-4ba0-a5f8-3ad2055eb8ec, chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=2611e513-4316-4421-8b89-1c0f37157967) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.907 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 2611e513-4316-4421-8b89-1c0f37157967 in datapath 712c0ef6-fbbe-4577-b44d-9610116b414a bound to our chassis
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.910 142606 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 10:25:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2805498511' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:59 compute-2 systemd-machined[194368]: New machine qemu-4-instance-0000000b.
Jan 23 10:25:59 compute-2 systemd-udevd[236921]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.926 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[3580955f-fda1-42b7-ae7c-ef57513e90f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.928 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap712c0ef6-f1 in ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.929 229823 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap712c0ef6-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.929 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[21a603aa-e13f-4389-9141-d50f7a9d132f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.931 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[584e043c-cd28-4763-909b-908eca1c5eb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:59 compute-2 NetworkManager[48964]: <info>  [1769163959.9330] device (tap2611e513-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:25:59 compute-2 NetworkManager[48964]: <info>  [1769163959.9336] device (tap2611e513-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:25:59 compute-2 systemd[1]: Started Virtual Machine qemu-4-instance-0000000b.
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.946 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[52df5c38-8da3-4a11-9cb3-efc6c6a9c483]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.956 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 ovn_controller[132789]: 2026-01-23T10:25:59Z|00057|binding|INFO|Setting lport 2611e513-4316-4421-8b89-1c0f37157967 ovn-installed in OVS
Jan 23 10:25:59 compute-2 ovn_controller[132789]: 2026-01-23T10:25:59Z|00058|binding|INFO|Setting lport 2611e513-4316-4421-8b89-1c0f37157967 up in Southbound
Jan 23 10:25:59 compute-2 nova_compute[225701]: 2026-01-23 10:25:59.961 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.962 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[7f976c9b-4c43-4aeb-979d-4c4d562de40f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.992 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[ba86b8e1-0766-478c-bc3a-62962b434aa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:59 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.997 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[7c691b1c-5287-451a-9aa8-e3e2eff4ab53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:59 compute-2 NetworkManager[48964]: <info>  [1769163959.9988] manager: (tap712c0ef6-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.027 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[f6fcbb37-0223-41be-b3d8-3be3c015460a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.030 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[321a0823-dec4-474d-976d-23b2df9669cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:00 compute-2 NetworkManager[48964]: <info>  [1769163960.0486] device (tap712c0ef6-f0): carrier: link connected
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.052 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c43b41-439f-41d2-bc23-a5011eed7c06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.069 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[1f271144-a9b9-4476-ac5e-12a5b9c5acc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap712c0ef6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:ec:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511269, 'reachable_time': 34347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236955, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.083 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[565ec319-9745-4372-83a6-7ea0ca9846ea]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed0:ec06'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511269, 'tstamp': 511269}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236956, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.098 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[40f81d93-cc84-45ae-9b96-e2ac4552aff9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap712c0ef6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:ec:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511269, 'reachable_time': 34347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236957, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.127 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[622dd496-0dc3-49bc-9abb-f7fea9f9f3df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.180 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[ad97e76a-27f8-4608-a4d0-1ac5f76f37ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.182 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap712c0ef6-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.182 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.183 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap712c0ef6-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:00 compute-2 NetworkManager[48964]: <info>  [1769163960.1852] manager: (tap712c0ef6-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 23 10:26:00 compute-2 kernel: tap712c0ef6-f0: entered promiscuous mode
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.185 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.188 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap712c0ef6-f0, col_values=(('external_ids', {'iface-id': '6c333384-cae4-4f40-8b56-257e8d961c46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:00 compute-2 ovn_controller[132789]: 2026-01-23T10:26:00Z|00059|binding|INFO|Releasing lport 6c333384-cae4-4f40-8b56-257e8d961c46 from this chassis (sb_readonly=0)
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.190 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.191 142606 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.192 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[a348a34b-c9f1-43b4-883d-131cf3efb915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.193 142606 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: global
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     log         /dev/log local0 debug
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     log-tag     haproxy-metadata-proxy-712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     user        root
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     group       root
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     maxconn     1024
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     pidfile     /var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     daemon
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: defaults
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     log global
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     mode http
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     option httplog
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     option dontlognull
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     option http-server-close
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     option forwardfor
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     retries                 3
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     timeout http-request    30s
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     timeout connect         30s
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     timeout client          32s
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     timeout server          32s
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     timeout http-keep-alive 30s
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: listen listener
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     bind 169.254.169.254:80
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:     http-request add-header X-OVN-Network-ID 712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:26:00 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.194 142606 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'env', 'PROCESS_TAG=haproxy-712c0ef6-fbbe-4577-b44d-9610116b414a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/712c0ef6-fbbe-4577-b44d-9610116b414a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.204 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.525 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163960.5252264, 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.526 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] VM Started (Lifecycle Event)
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.544 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.547 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163960.5264301, 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.547 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] VM Paused (Lifecycle Event)
Jan 23 10:26:00 compute-2 podman[237034]: 2026-01-23 10:26:00.549276532 +0000 UTC m=+0.051905426 container create 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.565 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.570 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.590 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:26:00 compute-2 systemd[1]: Started libpod-conmon-46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3.scope.
Jan 23 10:26:00 compute-2 podman[237034]: 2026-01-23 10:26:00.521598282 +0000 UTC m=+0.024227196 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:26:00 compute-2 systemd[1]: Started libcrun container.
Jan 23 10:26:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/455af21eacddd3d5239de182b4e4b79fd4186593d5fa50aaea2fa48c1d2e0bce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:26:00 compute-2 podman[237034]: 2026-01-23 10:26:00.641279041 +0000 UTC m=+0.143907965 container init 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 10:26:00 compute-2 podman[237034]: 2026-01-23 10:26:00.647097833 +0000 UTC m=+0.149726717 container start 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:26:00 compute-2 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [NOTICE]   (237053) : New worker (237055) forked
Jan 23 10:26:00 compute-2 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [NOTICE]   (237053) : Loading success.
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.671 225706 DEBUG nova.compute.manager [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.672 225706 DEBUG oslo_concurrency.lockutils [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.672 225706 DEBUG oslo_concurrency.lockutils [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.673 225706 DEBUG oslo_concurrency.lockutils [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.673 225706 DEBUG nova.compute.manager [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Processing event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.674 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.678 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163960.6787412, 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.679 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] VM Resumed (Lifecycle Event)
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.681 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.684 225706 INFO nova.virt.libvirt.driver [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance spawned successfully.
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.685 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.702 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.709 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.712 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.713 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.714 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.714 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.715 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.715 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.741 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.782 225706 INFO nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Took 6.33 seconds to spawn the instance on the hypervisor.
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.783 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.845 225706 INFO nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Took 7.26 seconds to build instance.
Jan 23 10:26:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:00 compute-2 nova_compute[225701]: 2026-01-23 10:26:00.865 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:00 compute-2 ceph-mon[75771]: pgmap v982: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:26:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:01.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:26:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:01.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:26:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:02 compute-2 ceph-mon[75771]: pgmap v983: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:26:02 compute-2 nova_compute[225701]: 2026-01-23 10:26:02.727 225706 DEBUG nova.compute.manager [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:02 compute-2 nova_compute[225701]: 2026-01-23 10:26:02.728 225706 DEBUG oslo_concurrency.lockutils [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:02 compute-2 nova_compute[225701]: 2026-01-23 10:26:02.728 225706 DEBUG oslo_concurrency.lockutils [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:02 compute-2 nova_compute[225701]: 2026-01-23 10:26:02.729 225706 DEBUG oslo_concurrency.lockutils [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:02 compute-2 nova_compute[225701]: 2026-01-23 10:26:02.729 225706 DEBUG nova.compute.manager [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:26:02 compute-2 nova_compute[225701]: 2026-01-23 10:26:02.729 225706 WARNING nova.compute.manager [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.
Jan 23 10:26:02 compute-2 nova_compute[225701]: 2026-01-23 10:26:02.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:02 compute-2 nova_compute[225701]: 2026-01-23 10:26:02.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:02 compute-2 nova_compute[225701]: 2026-01-23 10:26:02.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:26:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:03.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:03.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:03 compute-2 nova_compute[225701]: 2026-01-23 10:26:03.992 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:04 compute-2 nova_compute[225701]: 2026-01-23 10:26:04.057 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:04 compute-2 ceph-mon[75771]: pgmap v984: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:26:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:05.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:05.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:26:05 compute-2 sudo[237068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:26:05 compute-2 sudo[237068]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:05 compute-2 sudo[237068]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:05 compute-2 nova_compute[225701]: 2026-01-23 10:26:05.797 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:05 compute-2 nova_compute[225701]: 2026-01-23 10:26:05.798 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:26:05 compute-2 nova_compute[225701]: 2026-01-23 10:26:05.798 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:26:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:06 compute-2 nova_compute[225701]: 2026-01-23 10:26:06.801 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:26:06 compute-2 nova_compute[225701]: 2026-01-23 10:26:06.801 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:26:06 compute-2 nova_compute[225701]: 2026-01-23 10:26:06.801 225706 DEBUG nova.network.neutron [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 10:26:06 compute-2 nova_compute[225701]: 2026-01-23 10:26:06.802 225706 DEBUG nova.objects.instance [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:26:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:07.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:07 compute-2 ceph-mon[75771]: pgmap v985: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:26:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:07.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:07 compute-2 nova_compute[225701]: 2026-01-23 10:26:07.740 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:07 compute-2 NetworkManager[48964]: <info>  [1769163967.7443] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 23 10:26:07 compute-2 NetworkManager[48964]: <info>  [1769163967.7453] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 23 10:26:07 compute-2 ovn_controller[132789]: 2026-01-23T10:26:07Z|00060|binding|INFO|Releasing lport 6c333384-cae4-4f40-8b56-257e8d961c46 from this chassis (sb_readonly=0)
Jan 23 10:26:07 compute-2 nova_compute[225701]: 2026-01-23 10:26:07.769 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:07 compute-2 ovn_controller[132789]: 2026-01-23T10:26:07Z|00061|binding|INFO|Releasing lport 6c333384-cae4-4f40-8b56-257e8d961c46 from this chassis (sb_readonly=0)
Jan 23 10:26:07 compute-2 nova_compute[225701]: 2026-01-23 10:26:07.774 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:08 compute-2 ceph-mon[75771]: pgmap v986: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:26:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:08 compute-2 nova_compute[225701]: 2026-01-23 10:26:08.996 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:09.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.051 225706 DEBUG nova.compute.manager [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.052 225706 DEBUG nova.compute.manager [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.052 225706 DEBUG oslo_concurrency.lockutils [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.058 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:09.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1720359449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.562 225706 DEBUG nova.network.neutron [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.591 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.591 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.592 225706 DEBUG oslo_concurrency.lockutils [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.592 225706 DEBUG nova.network.neutron [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.593 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.788 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.809 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.811 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.811 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.811 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:26:09 compute-2 nova_compute[225701]: 2026-01-23 10:26:09.812 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:26:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1268700137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.289 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.437 225706 DEBUG nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.438 225706 DEBUG nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:26:10 compute-2 ceph-mon[75771]: pgmap v987: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:26:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/80523321' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1268700137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2269378824' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.607 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.608 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4731MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.609 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.609 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:10 compute-2 sudo[237124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:26:10 compute-2 sudo[237124]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:10 compute-2 sudo[237124]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.725 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Instance 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.725 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.725 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:26:10 compute-2 sudo[237149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:26:10 compute-2 sudo[237149]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.791 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.811 225706 DEBUG nova.network.neutron [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.812 225706 DEBUG nova.network.neutron [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:26:10 compute-2 nova_compute[225701]: 2026-01-23 10:26:10.826 225706 DEBUG oslo_concurrency.lockutils [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:26:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:11.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:26:11 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/311252267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:11 compute-2 nova_compute[225701]: 2026-01-23 10:26:11.246 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:11 compute-2 nova_compute[225701]: 2026-01-23 10:26:11.252 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:26:11 compute-2 nova_compute[225701]: 2026-01-23 10:26:11.267 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:26:11 compute-2 nova_compute[225701]: 2026-01-23 10:26:11.306 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:26:11 compute-2 nova_compute[225701]: 2026-01-23 10:26:11.306 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:11 compute-2 sudo[237149]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:11.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/311252267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/304457320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:13.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:13.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:13 compute-2 ceph-mon[75771]: pgmap v988: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:26:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:13 compute-2 nova_compute[225701]: 2026-01-23 10:26:13.998 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:14 compute-2 nova_compute[225701]: 2026-01-23 10:26:14.060 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:14 compute-2 nova_compute[225701]: 2026-01-23 10:26:14.302 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:14 compute-2 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:14 compute-2 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:14 compute-2 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:14 compute-2 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:14 compute-2 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:26:14 compute-2 ceph-mon[75771]: pgmap v989: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 23 10:26:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:14 compute-2 ovn_controller[132789]: 2026-01-23T10:26:14Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:a8:f2 10.100.0.13
Jan 23 10:26:14 compute-2 ovn_controller[132789]: 2026-01-23T10:26:14Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:a8:f2 10.100.0.13
Jan 23 10:26:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:15.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:15.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:15 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:15 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1174453001' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:15 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:26:15 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:26:15 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:16 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:16 compute-2 ceph-mon[75771]: pgmap v990: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 23 10:26:16 compute-2 ceph-mon[75771]: pgmap v991: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 75 op/s
Jan 23 10:26:16 compute-2 ceph-mon[75771]: pgmap v992: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 372 B/s rd, 0 op/s
Jan 23 10:26:16 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:16 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:26:16 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:26:16 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:26:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:17.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:17.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:18 compute-2 ceph-mon[75771]: pgmap v993: 353 pgs: 353 active+clean; 167 MiB data, 325 MiB used, 60 GiB / 60 GiB avail; 562 KiB/s rd, 5.7 MiB/s wr, 131 op/s
Jan 23 10:26:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:19 compute-2 nova_compute[225701]: 2026-01-23 10:26:19.000 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:26:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:19.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:26:19 compute-2 nova_compute[225701]: 2026-01-23 10:26:19.061 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:19.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:20 compute-2 sudo[237235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:26:20 compute-2 sudo[237235]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:20 compute-2 sudo[237235]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:20 compute-2 ceph-mon[75771]: pgmap v994: 353 pgs: 353 active+clean; 167 MiB data, 325 MiB used, 60 GiB / 60 GiB avail; 562 KiB/s rd, 5.7 MiB/s wr, 131 op/s
Jan 23 10:26:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:26:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1766751452' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:26:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:21.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:21.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:21 compute-2 nova_compute[225701]: 2026-01-23 10:26:21.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:21 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1512136197' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:26:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:22 compute-2 ceph-mon[75771]: pgmap v995: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 567 KiB/s rd, 5.7 MiB/s wr, 135 op/s
Jan 23 10:26:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:23.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:23.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:24 compute-2 nova_compute[225701]: 2026-01-23 10:26:24.002 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:24 compute-2 nova_compute[225701]: 2026-01-23 10:26:24.062 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:24 compute-2 ceph-mon[75771]: pgmap v996: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 567 KiB/s rd, 5.7 MiB/s wr, 135 op/s
Jan 23 10:26:24 compute-2 nova_compute[225701]: 2026-01-23 10:26:24.803 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:24 compute-2 nova_compute[225701]: 2026-01-23 10:26:24.803 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:26:24 compute-2 nova_compute[225701]: 2026-01-23 10:26:24.820 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:26:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:25.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:25.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:25 compute-2 sudo[237267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:26:25 compute-2 sudo[237267]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:25 compute-2 sudo[237267]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:26 compute-2 ceph-mon[75771]: pgmap v997: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 467 KiB/s rd, 4.7 MiB/s wr, 112 op/s
Jan 23 10:26:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:27.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:27.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:27 compute-2 podman[237295]: 2026-01-23 10:26:27.634602064 +0000 UTC m=+0.054721955 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:26:27 compute-2 podman[237294]: 2026-01-23 10:26:27.664635581 +0000 UTC m=+0.084562767 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 10:26:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:28 compute-2 ceph-mon[75771]: pgmap v998: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 3.9 MiB/s wr, 167 op/s
Jan 23 10:26:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:29 compute-2 nova_compute[225701]: 2026-01-23 10:26:29.004 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:29 compute-2 nova_compute[225701]: 2026-01-23 10:26:29.064 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:29.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:29.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:30 compute-2 ceph-mon[75771]: pgmap v999: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 77 op/s
Jan 23 10:26:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:31.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:31.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:32 compute-2 ceph-mon[75771]: pgmap v1000: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 77 op/s
Jan 23 10:26:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:33.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:33.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:34 compute-2 nova_compute[225701]: 2026-01-23 10:26:34.005 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:34 compute-2 nova_compute[225701]: 2026-01-23 10:26:34.066 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:34 compute-2 ceph-mon[75771]: pgmap v1001: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:26:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:35.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:35.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:26:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:36 compute-2 ceph-mon[75771]: pgmap v1002: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:26:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:37.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:37.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:38 compute-2 ceph-mon[75771]: pgmap v1003: 353 pgs: 353 active+clean; 188 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.0 MiB/s wr, 117 op/s
Jan 23 10:26:39 compute-2 nova_compute[225701]: 2026-01-23 10:26:39.009 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:39 compute-2 nova_compute[225701]: 2026-01-23 10:26:39.067 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:39.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:39.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:40 compute-2 ceph-mon[75771]: pgmap v1004: 353 pgs: 353 active+clean; 188 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 237 KiB/s rd, 2.0 MiB/s wr, 43 op/s
Jan 23 10:26:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:41.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:41.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:42 compute-2 ceph-mon[75771]: pgmap v1005: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Jan 23 10:26:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:43.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:43.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:44 compute-2 nova_compute[225701]: 2026-01-23 10:26:44.012 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:44 compute-2 nova_compute[225701]: 2026-01-23 10:26:44.068 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:44 compute-2 ceph-mon[75771]: pgmap v1006: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:26:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:44 compute-2 nova_compute[225701]: 2026-01-23 10:26:44.949 225706 INFO nova.compute.manager [None req-20215e91-4413-4a39-aa3e-9cf7fd1b6aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Get console output
Jan 23 10:26:44 compute-2 nova_compute[225701]: 2026-01-23 10:26:44.958 225706 INFO oslo.privsep.daemon [None req-20215e91-4413-4a39-aa3e-9cf7fd1b6aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpm1dfunpz/privsep.sock']
Jan 23 10:26:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:45.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:45.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:45 compute-2 nova_compute[225701]: 2026-01-23 10:26:45.721 225706 INFO oslo.privsep.daemon [None req-20215e91-4413-4a39-aa3e-9cf7fd1b6aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Spawned new privsep daemon via rootwrap
Jan 23 10:26:45 compute-2 nova_compute[225701]: 2026-01-23 10:26:45.601 237361 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:26:45 compute-2 nova_compute[225701]: 2026-01-23 10:26:45.608 237361 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:26:45 compute-2 nova_compute[225701]: 2026-01-23 10:26:45.612 237361 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 23 10:26:45 compute-2 nova_compute[225701]: 2026-01-23 10:26:45.613 237361 INFO oslo.privsep.daemon [-] privsep daemon running as pid 237361
Jan 23 10:26:45 compute-2 nova_compute[225701]: 2026-01-23 10:26:45.817 237361 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 10:26:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:45 compute-2 sudo[237363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:26:45 compute-2 sudo[237363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:45 compute-2 sudo[237363]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:46 compute-2 ceph-mon[75771]: pgmap v1007: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:26:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:47 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:46.999 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:26:47 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:47.000 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.001 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:47.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.135 225706 DEBUG nova.compute.manager [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.136 225706 DEBUG nova.compute.manager [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.136 225706 DEBUG oslo_concurrency.lockutils [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.137 225706 DEBUG oslo_concurrency.lockutils [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.137 225706 DEBUG nova.network.neutron [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.172 225706 DEBUG nova.compute.manager [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.173 225706 DEBUG oslo_concurrency.lockutils [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.174 225706 DEBUG oslo_concurrency.lockutils [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.174 225706 DEBUG oslo_concurrency.lockutils [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.174 225706 DEBUG nova.compute.manager [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:26:47 compute-2 nova_compute[225701]: 2026-01-23 10:26:47.175 225706 WARNING nova.compute.manager [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.
Jan 23 10:26:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:47.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:48 compute-2 nova_compute[225701]: 2026-01-23 10:26:48.163 225706 INFO nova.compute.manager [None req-e98b0cfc-3a11-43be-ab3f-f7a210467de3 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Get console output
Jan 23 10:26:48 compute-2 nova_compute[225701]: 2026-01-23 10:26:48.170 237361 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 10:26:48 compute-2 ceph-mon[75771]: pgmap v1008: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 392 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Jan 23 10:26:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.014 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.069 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:49.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:49.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.590 225706 DEBUG nova.compute.manager [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.591 225706 DEBUG oslo_concurrency.lockutils [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.591 225706 DEBUG oslo_concurrency.lockutils [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.591 225706 DEBUG oslo_concurrency.lockutils [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.592 225706 DEBUG nova.compute.manager [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.592 225706 WARNING nova.compute.manager [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.718 225706 DEBUG nova.network.neutron [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.719 225706 DEBUG nova.network.neutron [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:26:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2133716442' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:26:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2133716442' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:26:49 compute-2 nova_compute[225701]: 2026-01-23 10:26:49.958 225706 DEBUG oslo_concurrency.lockutils [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:26:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:51 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:51.002 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.090 225706 DEBUG nova.compute.manager [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.091 225706 DEBUG nova.compute.manager [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.091 225706 DEBUG oslo_concurrency.lockutils [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.091 225706 DEBUG oslo_concurrency.lockutils [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.091 225706 DEBUG nova.network.neutron [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:26:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:51.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:51 compute-2 ceph-mon[75771]: pgmap v1009: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 155 KiB/s rd, 107 KiB/s wr, 23 op/s
Jan 23 10:26:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.101 225706 INFO nova.compute.manager [None req-588c94de-092b-4d0f-8588-d7670846f3f7 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Get console output
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.105 237361 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 10:26:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:51 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:51.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.699 225706 DEBUG nova.compute.manager [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.700 225706 DEBUG oslo_concurrency.lockutils [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.700 225706 DEBUG oslo_concurrency.lockutils [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.701 225706 DEBUG oslo_concurrency.lockutils [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.701 225706 DEBUG nova.compute.manager [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:26:51 compute-2 nova_compute[225701]: 2026-01-23 10:26:51.702 225706 WARNING nova.compute.manager [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.
Jan 23 10:26:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:52 compute-2 ceph-mon[75771]: pgmap v1010: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 162 KiB/s rd, 112 KiB/s wr, 24 op/s
Jan 23 10:26:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:26:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:53.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:26:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:53 compute-2 nova_compute[225701]: 2026-01-23 10:26:53.319 225706 DEBUG nova.network.neutron [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:26:53 compute-2 nova_compute[225701]: 2026-01-23 10:26:53.319 225706 DEBUG nova.network.neutron [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:26:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:53.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:53 compute-2 nova_compute[225701]: 2026-01-23 10:26:53.384 225706 DEBUG oslo_concurrency.lockutils [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:26:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:53 compute-2 nova_compute[225701]: 2026-01-23 10:26:53.988 225706 DEBUG nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:53 compute-2 nova_compute[225701]: 2026-01-23 10:26:53.988 225706 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:53 compute-2 nova_compute[225701]: 2026-01-23 10:26:53.990 225706 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:53 compute-2 nova_compute[225701]: 2026-01-23 10:26:53.990 225706 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:53 compute-2 nova_compute[225701]: 2026-01-23 10:26:53.991 225706 DEBUG nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:26:53 compute-2 nova_compute[225701]: 2026-01-23 10:26:53.991 225706 WARNING nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.
Jan 23 10:26:54 compute-2 nova_compute[225701]: 2026-01-23 10:26:54.016 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-2 nova_compute[225701]: 2026-01-23 10:26:54.070 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:55.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:55.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:55 compute-2 ceph-mon[75771]: pgmap v1011: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 17 KiB/s wr, 2 op/s
Jan 23 10:26:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:55.494 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:55.495 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:26:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:56 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:57.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:57 compute-2 ceph-mon[75771]: pgmap v1012: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 17 KiB/s wr, 2 op/s
Jan 23 10:26:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:26:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:26:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:58 compute-2 ceph-mon[75771]: pgmap v1013: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 19 KiB/s wr, 8 op/s
Jan 23 10:26:58 compute-2 podman[237403]: 2026-01-23 10:26:58.632667957 +0000 UTC m=+0.052243994 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 10:26:58 compute-2 podman[237402]: 2026-01-23 10:26:58.68779159 +0000 UTC m=+0.101725898 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 10:26:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:59 compute-2 nova_compute[225701]: 2026-01-23 10:26:59.017 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:59 compute-2 nova_compute[225701]: 2026-01-23 10:26:59.072 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:59.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:26:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:26:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:26:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:59.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:26:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:01.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:01.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:01 compute-2 ceph-mon[75771]: pgmap v1014: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 6.7 KiB/s wr, 8 op/s
Jan 23 10:27:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:03.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:03 compute-2 ceph-mon[75771]: pgmap v1015: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 17 KiB/s wr, 31 op/s
Jan 23 10:27:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/138703590' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:03.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:03 compute-2 nova_compute[225701]: 2026-01-23 10:27:03.796 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:04 compute-2 nova_compute[225701]: 2026-01-23 10:27:04.019 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:04 compute-2 nova_compute[225701]: 2026-01-23 10:27:04.072 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:05 compute-2 ceph-mon[75771]: pgmap v1016: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 29 op/s
Jan 23 10:27:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:05.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:05.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:05 compute-2 nova_compute[225701]: 2026-01-23 10:27:05.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:05 compute-2 nova_compute[225701]: 2026-01-23 10:27:05.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:27:05 compute-2 nova_compute[225701]: 2026-01-23 10:27:05.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:27:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:06 compute-2 sudo[237457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:27:06 compute-2 sudo[237457]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:06 compute-2 sudo[237457]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:06 compute-2 nova_compute[225701]: 2026-01-23 10:27:06.793 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:27:06 compute-2 nova_compute[225701]: 2026-01-23 10:27:06.793 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:27:06 compute-2 nova_compute[225701]: 2026-01-23 10:27:06.794 225706 DEBUG nova.network.neutron [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 10:27:06 compute-2 nova_compute[225701]: 2026-01-23 10:27:06.794 225706 DEBUG nova.objects.instance [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:27:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:27:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:07.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:27:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:07.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:27:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:08 compute-2 ceph-mon[75771]: pgmap v1017: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 29 op/s
Jan 23 10:27:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:09 compute-2 nova_compute[225701]: 2026-01-23 10:27:09.036 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:09 compute-2 nova_compute[225701]: 2026-01-23 10:27:09.074 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000048s ======
Jan 23 10:27:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:09.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Jan 23 10:27:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:09.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:09 compute-2 nova_compute[225701]: 2026-01-23 10:27:09.593 225706 DEBUG nova.compute.manager [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:27:09 compute-2 nova_compute[225701]: 2026-01-23 10:27:09.593 225706 DEBUG nova.compute.manager [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:27:09 compute-2 nova_compute[225701]: 2026-01-23 10:27:09.593 225706 DEBUG oslo_concurrency.lockutils [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:27:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.206 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.206 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.207 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.207 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.207 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.208 225706 INFO nova.compute.manager [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Terminating instance
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.209 225706 DEBUG nova.compute.manager [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.212 225706 DEBUG nova.network.neutron [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.236 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.236 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.236 225706 DEBUG oslo_concurrency.lockutils [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.237 225706 DEBUG nova.network.neutron [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:27:10 compute-2 ceph-mon[75771]: pgmap v1018: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 30 op/s
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:10 compute-2 nova_compute[225701]: 2026-01-23 10:27:10.807 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:11.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:11.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.509 225706 DEBUG nova.network.neutron [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.509 225706 DEBUG nova.network.neutron [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.536 225706 DEBUG oslo_concurrency.lockutils [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:27:11 compute-2 kernel: tap2611e513-43 (unregistering): left promiscuous mode
Jan 23 10:27:11 compute-2 NetworkManager[48964]: <info>  [1769164031.6608] device (tap2611e513-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:27:11 compute-2 ovn_controller[132789]: 2026-01-23T10:27:11Z|00062|binding|INFO|Releasing lport 2611e513-4316-4421-8b89-1c0f37157967 from this chassis (sb_readonly=0)
Jan 23 10:27:11 compute-2 ovn_controller[132789]: 2026-01-23T10:27:11Z|00063|binding|INFO|Setting lport 2611e513-4316-4421-8b89-1c0f37157967 down in Southbound
Jan 23 10:27:11 compute-2 ovn_controller[132789]: 2026-01-23T10:27:11Z|00064|binding|INFO|Removing iface tap2611e513-43 ovn-installed in OVS
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.670 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.673 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:11 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.681 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:a8:f2 10.100.0.13'], port_security=['fa:16:3e:58:a8:f2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-712c0ef6-fbbe-4577-b44d-9610116b414a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1d01fb50-5068-4dfb-b608-e6e67ad89b2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3547f5ca-ca7c-4ba0-a5f8-3ad2055eb8ec, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=2611e513-4316-4421-8b89-1c0f37157967) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:27:11 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.683 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 2611e513-4316-4421-8b89-1c0f37157967 in datapath 712c0ef6-fbbe-4577-b44d-9610116b414a unbound from our chassis
Jan 23 10:27:11 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.685 142606 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 712c0ef6-fbbe-4577-b44d-9610116b414a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:27:11 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.689 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[6e03dcd2-604c-4b12-91da-b67937982d00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.690 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:11 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.692 142606 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a namespace which is not needed anymore
Jan 23 10:27:11 compute-2 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 23 10:27:11 compute-2 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Consumed 15.850s CPU time.
Jan 23 10:27:11 compute-2 systemd-machined[194368]: Machine qemu-4-instance-0000000b terminated.
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.823 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.824 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.824 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.825 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.825 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.846 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.852 225706 INFO nova.virt.libvirt.driver [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance destroyed successfully.
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.852 225706 DEBUG nova.objects.instance [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:27:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.869 225706 DEBUG nova.virt.libvirt.vif [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1129329512',display_name='tempest-TestNetworkBasicOps-server-1129329512',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1129329512',id=11,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOLllCuGpYDHB8HQl4gVCADogEY6z7uz5xBJbTjU7iL3TTWWE5uwU0nWT40qz7D0IhyDFXlwX4fWDCogYSyOPhCdGvOGsxFut3XTWNKcRsbqCULLjO4VMFh09pWX8E0IA==',key_name='tempest-TestNetworkBasicOps-1378329290',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:26:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-6r33a8b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:26:00Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.870 225706 DEBUG nova.network.os_vif_util [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.871 225706 DEBUG nova.network.os_vif_util [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.871 225706 DEBUG os_vif [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.873 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.873 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2611e513-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.875 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.876 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:11 compute-2 nova_compute[225701]: 2026-01-23 10:27:11.880 225706 INFO os_vif [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43')
Jan 23 10:27:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:12 compute-2 ceph-mon[75771]: pgmap v1019: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 11 KiB/s wr, 23 op/s
Jan 23 10:27:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/44430795' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:12 compute-2 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [NOTICE]   (237053) : haproxy version is 2.8.14-c23fe91
Jan 23 10:27:12 compute-2 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [NOTICE]   (237053) : path to executable is /usr/sbin/haproxy
Jan 23 10:27:12 compute-2 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [WARNING]  (237053) : Exiting Master process...
Jan 23 10:27:12 compute-2 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [ALERT]    (237053) : Current worker (237055) exited with code 143 (Terminated)
Jan 23 10:27:12 compute-2 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [WARNING]  (237053) : All workers exited. Exiting... (0)
Jan 23 10:27:12 compute-2 systemd[1]: libpod-46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3.scope: Deactivated successfully.
Jan 23 10:27:12 compute-2 podman[237511]: 2026-01-23 10:27:12.627418671 +0000 UTC m=+0.818555966 container died 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 10:27:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:13 compute-2 ceph-mon[75771]: pgmap v1020: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 15 KiB/s rd, 11 KiB/s wr, 24 op/s
Jan 23 10:27:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3962373468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:13 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3-userdata-shm.mount: Deactivated successfully.
Jan 23 10:27:13 compute-2 systemd[1]: var-lib-containers-storage-overlay-455af21eacddd3d5239de182b4e4b79fd4186593d5fa50aaea2fa48c1d2e0bce-merged.mount: Deactivated successfully.
Jan 23 10:27:13 compute-2 podman[237511]: 2026-01-23 10:27:13.076589229 +0000 UTC m=+1.267726504 container cleanup 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.082 225706 DEBUG nova.compute.manager [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.083 225706 DEBUG oslo_concurrency.lockutils [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.083 225706 DEBUG oslo_concurrency.lockutils [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.083 225706 DEBUG oslo_concurrency.lockutils [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.084 225706 DEBUG nova.compute.manager [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.084 225706 DEBUG nova.compute.manager [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 10:27:13 compute-2 systemd[1]: libpod-conmon-46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3.scope: Deactivated successfully.
Jan 23 10:27:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:13.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:13 compute-2 podman[237588]: 2026-01-23 10:27:13.33566186 +0000 UTC m=+0.231422873 container remove 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 10:27:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.343 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[5225c8b6-c06d-4eb4-b647-6d819df259a2]: (4, ('Fri Jan 23 10:27:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a (46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3)\n46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3\nFri Jan 23 10:27:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a (46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3)\n46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:27:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.346 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[7a517479-a4c5-436a-b1e5-33258409200d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:27:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.348 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap712c0ef6-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:27:13 compute-2 kernel: tap712c0ef6-f0: left promiscuous mode
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.381 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.394 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.398 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[35561923-747d-4bda-ba5f-1a66b452588c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:27:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:13.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:27:13 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/506360951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.415 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[e7af4a28-f868-4347-964c-fe1866adeaec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:27:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.417 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[7048755b-7537-4c47-aebd-6f557c2b2b7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:27:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.434 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[a717d8d7-ce6a-421c-9993-bd79d450c71a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511263, 'reachable_time': 42014, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237610, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:27:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.439 142723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:27:13 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.439 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[c992ed24-172b-42b7-9dd7-4d0f468839a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:27:13 compute-2 systemd[1]: run-netns-ovnmeta\x2d712c0ef6\x2dfbbe\x2d4577\x2db44d\x2d9610116b414a.mount: Deactivated successfully.
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.442 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.500 225706 DEBUG nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.500 225706 DEBUG nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.640 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.642 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4872MB free_disk=59.94270324707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.642 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.642 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.721 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Instance 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.721 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.721 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:27:13 compute-2 nova_compute[225701]: 2026-01-23 10:27:13.760 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:27:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.022 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:14 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1204745815' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:14 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1089974770' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:14 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/506360951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:27:14 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3948964576' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.294 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.300 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.318 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.345 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.346 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.350 225706 DEBUG nova.compute.manager [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.350 225706 DEBUG oslo_concurrency.lockutils [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.351 225706 DEBUG oslo_concurrency.lockutils [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.351 225706 DEBUG oslo_concurrency.lockutils [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.351 225706 DEBUG nova.compute.manager [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:27:14 compute-2 nova_compute[225701]: 2026-01-23 10:27:14.351 225706 WARNING nova.compute.manager [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state deleting.
Jan 23 10:27:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:15 compute-2 ceph-mon[75771]: pgmap v1021: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:27:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3948964576' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:15.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:15.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:16 compute-2 nova_compute[225701]: 2026-01-23 10:27:16.347 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:16 compute-2 nova_compute[225701]: 2026-01-23 10:27:16.347 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:16 compute-2 nova_compute[225701]: 2026-01-23 10:27:16.347 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:16 compute-2 nova_compute[225701]: 2026-01-23 10:27:16.347 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:16 compute-2 nova_compute[225701]: 2026-01-23 10:27:16.348 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:16 compute-2 nova_compute[225701]: 2026-01-23 10:27:16.348 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:27:16 compute-2 nova_compute[225701]: 2026-01-23 10:27:16.877 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:17.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:17.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:17 compute-2 ceph-mon[75771]: pgmap v1022: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:27:17 compute-2 nova_compute[225701]: 2026-01-23 10:27:17.827 225706 INFO nova.virt.libvirt.driver [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Deleting instance files /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_del
Jan 23 10:27:17 compute-2 nova_compute[225701]: 2026-01-23 10:27:17.828 225706 INFO nova.virt.libvirt.driver [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Deletion of /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_del complete
Jan 23 10:27:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:17 compute-2 nova_compute[225701]: 2026-01-23 10:27:17.895 225706 INFO nova.compute.manager [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Took 7.69 seconds to destroy the instance on the hypervisor.
Jan 23 10:27:17 compute-2 nova_compute[225701]: 2026-01-23 10:27:17.895 225706 DEBUG oslo.service.loopingcall [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 10:27:17 compute-2 nova_compute[225701]: 2026-01-23 10:27:17.896 225706 DEBUG nova.compute.manager [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 10:27:17 compute-2 nova_compute[225701]: 2026-01-23 10:27:17.896 225706 DEBUG nova.network.neutron [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 10:27:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:18 compute-2 ceph-mon[75771]: pgmap v1023: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 852 B/s wr, 26 op/s
Jan 23 10:27:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:19 compute-2 nova_compute[225701]: 2026-01-23 10:27:19.024 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:19.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:19.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:19 compute-2 nova_compute[225701]: 2026-01-23 10:27:19.530 225706 DEBUG nova.network.neutron [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:27:19 compute-2 nova_compute[225701]: 2026-01-23 10:27:19.561 225706 INFO nova.compute.manager [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Took 1.67 seconds to deallocate network for instance.
Jan 23 10:27:19 compute-2 nova_compute[225701]: 2026-01-23 10:27:19.627 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:19 compute-2 nova_compute[225701]: 2026-01-23 10:27:19.628 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:19 compute-2 nova_compute[225701]: 2026-01-23 10:27:19.674 225706 DEBUG nova.compute.manager [req-90d7f807-96f1-48fc-9d1c-5762c0e654f5 req-52dcc5ab-4ace-496c-b3f8-4a158b7bad2c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-deleted-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:27:19 compute-2 nova_compute[225701]: 2026-01-23 10:27:19.680 225706 DEBUG oslo_concurrency.processutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:27:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:27:20 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2669587480' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:20 compute-2 nova_compute[225701]: 2026-01-23 10:27:20.157 225706 DEBUG oslo_concurrency.processutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:27:20 compute-2 nova_compute[225701]: 2026-01-23 10:27:20.163 225706 DEBUG nova.compute.provider_tree [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:27:20 compute-2 nova_compute[225701]: 2026-01-23 10:27:20.180 225706 DEBUG nova.scheduler.client.report [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:27:20 compute-2 nova_compute[225701]: 2026-01-23 10:27:20.319 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:20 compute-2 sudo[237664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:27:20 compute-2 sudo[237664]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:20 compute-2 sudo[237664]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:20 compute-2 nova_compute[225701]: 2026-01-23 10:27:20.341 225706 INFO nova.scheduler.client.report [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad
Jan 23 10:27:20 compute-2 sudo[237689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:27:20 compute-2 sudo[237689]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:20 compute-2 nova_compute[225701]: 2026-01-23 10:27:20.457 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:20 compute-2 sudo[237689]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:21.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:21.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:21 compute-2 ceph-mon[75771]: pgmap v1024: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 852 B/s wr, 25 op/s
Jan 23 10:27:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:27:21 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2669587480' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:21 compute-2 nova_compute[225701]: 2026-01-23 10:27:21.879 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:22 compute-2 ceph-mon[75771]: pgmap v1025: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:27:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:23.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:23.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:24 compute-2 nova_compute[225701]: 2026-01-23 10:27:24.026 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:24 compute-2 ceph-mon[75771]: pgmap v1026: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:27:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:25.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:25.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:25 compute-2 nova_compute[225701]: 2026-01-23 10:27:25.681 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:25 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:25 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:25 compute-2 nova_compute[225701]: 2026-01-23 10:27:25.831 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:26 compute-2 sudo[237755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:27:26 compute-2 sudo[237755]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:26 compute-2 sudo[237755]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:26 compute-2 ceph-mon[75771]: pgmap v1027: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:27:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:27:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:27:26 compute-2 ceph-mon[75771]: pgmap v1028: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 1.3 KiB/s wr, 31 op/s
Jan 23 10:27:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:26 compute-2 ceph-mon[75771]: Health check update: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 10:27:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:27:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:27:26 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:27:26 compute-2 nova_compute[225701]: 2026-01-23 10:27:26.850 225706 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164031.845321, 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:27:26 compute-2 nova_compute[225701]: 2026-01-23 10:27:26.851 225706 INFO nova.compute.manager [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] VM Stopped (Lifecycle Event)
Jan 23 10:27:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:26 compute-2 nova_compute[225701]: 2026-01-23 10:27:26.882 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:26 compute-2 nova_compute[225701]: 2026-01-23 10:27:26.887 225706 DEBUG nova.compute.manager [None req-1cefaecd-2424-412a-86d8-a1d9fb0d2b20 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:27:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:27.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:27.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:29 compute-2 nova_compute[225701]: 2026-01-23 10:27:29.028 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:29.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:29.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:29 compute-2 ceph-mon[75771]: pgmap v1029: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 383 B/s wr, 3 op/s
Jan 23 10:27:29 compute-2 podman[237783]: 2026-01-23 10:27:29.645604919 +0000 UTC m=+0.060087268 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 10:27:29 compute-2 podman[237782]: 2026-01-23 10:27:29.679825366 +0000 UTC m=+0.090173463 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller)
Jan 23 10:27:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:30 compute-2 ceph-mon[75771]: pgmap v1030: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 383 B/s wr, 3 op/s
Jan 23 10:27:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:31.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:31.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:31 compute-2 sudo[237828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:27:31 compute-2 sudo[237828]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:31 compute-2 sudo[237828]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:31 compute-2 nova_compute[225701]: 2026-01-23 10:27:31.886 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:32 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:32 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:32 compute-2 ceph-mon[75771]: pgmap v1031: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 574 B/s rd, 0 op/s
Jan 23 10:27:32 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:33.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:33.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:34 compute-2 nova_compute[225701]: 2026-01-23 10:27:34.029 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:35 compute-2 ceph-mon[75771]: pgmap v1032: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 574 B/s rd, 0 op/s
Jan 23 10:27:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:27:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:35.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:35.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:36 compute-2 nova_compute[225701]: 2026-01-23 10:27:36.889 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:37 compute-2 ceph-mon[75771]: pgmap v1033: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 574 B/s rd, 0 op/s
Jan 23 10:27:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:37.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:37.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:37 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:38 compute-2 ceph-mon[75771]: pgmap v1034: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:27:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:39 compute-2 nova_compute[225701]: 2026-01-23 10:27:39.030 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:39.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:39.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:40 compute-2 ceph-mon[75771]: pgmap v1035: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:41.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:41.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:41 compute-2 nova_compute[225701]: 2026-01-23 10:27:41.892 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:43.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:27:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:43.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:27:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:44 compute-2 nova_compute[225701]: 2026-01-23 10:27:44.032 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:44 compute-2 ceph-mon[75771]: pgmap v1036: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:27:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:45.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:45.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:46 compute-2 ceph-mon[75771]: pgmap v1037: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:46 compute-2 sudo[237869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:27:46 compute-2 sudo[237869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:46 compute-2 sudo[237869]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:46 compute-2 nova_compute[225701]: 2026-01-23 10:27:46.895 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:47 compute-2 ceph-mon[75771]: pgmap v1038: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:47.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:47.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:47 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:47.571 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:27:47 compute-2 nova_compute[225701]: 2026-01-23 10:27:47.572 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:47 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:47.573 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:27:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:49 compute-2 nova_compute[225701]: 2026-01-23 10:27:49.076 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:49.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:49 compute-2 ceph-mon[75771]: pgmap v1039: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:27:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2175178822' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:27:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2175178822' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:27:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:49.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:27:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:51.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:51.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:51 compute-2 nova_compute[225701]: 2026-01-23 10:27:51.898 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:52 compute-2 ceph-mon[75771]: pgmap v1040: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:52 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/855952180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:53.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000048s ======
Jan 23 10:27:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:53.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Jan 23 10:27:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:54 compute-2 nova_compute[225701]: 2026-01-23 10:27:54.079 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:54 compute-2 ceph-mon[75771]: pgmap v1041: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:27:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:55.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:27:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:55.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:27:55 compute-2 ceph-mon[75771]: pgmap v1042: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:55.495 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:56 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:27:56.575 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:27:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:56 compute-2 nova_compute[225701]: 2026-01-23 10:27:56.900 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:57 compute-2 ceph-mon[75771]: pgmap v1043: 353 pgs: 353 active+clean; 54 MiB data, 269 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 822 KiB/s wr, 1 op/s
Jan 23 10:27:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:57.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:57.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:59 compute-2 nova_compute[225701]: 2026-01-23 10:27:59.122 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:59.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:59 compute-2 ceph-mon[75771]: pgmap v1044: 353 pgs: 353 active+clean; 84 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 9.0 KiB/s rd, 1.7 MiB/s wr, 16 op/s
Jan 23 10:27:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:27:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:59.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:27:59 compute-2 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 10:27:59 compute-2 podman[237908]: 2026-01-23 10:27:59.956657268 +0000 UTC m=+0.046640585 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 10:27:59 compute-2 podman[237907]: 2026-01-23 10:27:59.981710838 +0000 UTC m=+0.077790335 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 10:28:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:01.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:01.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:01 compute-2 nova_compute[225701]: 2026-01-23 10:28:01.903 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:03.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:03.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:03 compute-2 ceph-mds[83039]: mds.beacon.cephfs.compute-2.prgzmm missed beacon ack from the monitors
Jan 23 10:28:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:04 compute-2 nova_compute[225701]: 2026-01-23 10:28:04.124 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2299921761' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:28:04 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3503106741' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:28:04 compute-2 nova_compute[225701]: 2026-01-23 10:28:04.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:05.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:05.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:06 compute-2 ceph-mon[75771]: pgmap v1045: 353 pgs: 353 active+clean; 84 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 8.7 KiB/s rd, 1.7 MiB/s wr, 16 op/s
Jan 23 10:28:06 compute-2 ceph-mon[75771]: pgmap v1046: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:28:06 compute-2 ceph-mon[75771]: pgmap v1047: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:28:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:28:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:06 compute-2 sudo[237960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:28:06 compute-2 sudo[237960]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:06 compute-2 sudo[237960]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:06 compute-2 nova_compute[225701]: 2026-01-23 10:28:06.906 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:07.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:07 compute-2 ovn_controller[132789]: 2026-01-23T10:28:07Z|00065|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 10:28:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:07.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:07 compute-2 nova_compute[225701]: 2026-01-23 10:28:07.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:07 compute-2 nova_compute[225701]: 2026-01-23 10:28:07.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:28:07 compute-2 nova_compute[225701]: 2026-01-23 10:28:07.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:28:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:08 compute-2 nova_compute[225701]: 2026-01-23 10:28:08.060 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:28:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:08 compute-2 ceph-mon[75771]: pgmap v1048: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 23 10:28:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:09 compute-2 nova_compute[225701]: 2026-01-23 10:28:09.126 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:09.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:09 compute-2 ceph-mon[75771]: pgmap v1049: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1005 KiB/s wr, 31 op/s
Jan 23 10:28:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:09.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:10 compute-2 ceph-mon[75771]: pgmap v1050: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 55 KiB/s wr, 16 op/s
Jan 23 10:28:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:11.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:11.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:11 compute-2 nova_compute[225701]: 2026-01-23 10:28:11.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:11 compute-2 nova_compute[225701]: 2026-01-23 10:28:11.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:11 compute-2 nova_compute[225701]: 2026-01-23 10:28:11.810 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:28:11 compute-2 nova_compute[225701]: 2026-01-23 10:28:11.810 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:28:11 compute-2 nova_compute[225701]: 2026-01-23 10:28:11.810 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:28:11 compute-2 nova_compute[225701]: 2026-01-23 10:28:11.810 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:28:11 compute-2 nova_compute[225701]: 2026-01-23 10:28:11.811 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:28:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:11 compute-2 nova_compute[225701]: 2026-01-23 10:28:11.908 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:12 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:28:12 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/676391413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:12 compute-2 nova_compute[225701]: 2026-01-23 10:28:12.306 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:28:12 compute-2 nova_compute[225701]: 2026-01-23 10:28:12.478 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:28:12 compute-2 nova_compute[225701]: 2026-01-23 10:28:12.479 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4879MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:28:12 compute-2 nova_compute[225701]: 2026-01-23 10:28:12.479 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:28:12 compute-2 nova_compute[225701]: 2026-01-23 10:28:12.480 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:28:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:13 compute-2 nova_compute[225701]: 2026-01-23 10:28:13.129 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:28:13 compute-2 nova_compute[225701]: 2026-01-23 10:28:13.130 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:28:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:13 compute-2 nova_compute[225701]: 2026-01-23 10:28:13.189 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:28:13 compute-2 nova_compute[225701]: 2026-01-23 10:28:13.209 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:28:13 compute-2 nova_compute[225701]: 2026-01-23 10:28:13.210 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:28:13 compute-2 nova_compute[225701]: 2026-01-23 10:28:13.227 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:28:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:13.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:13 compute-2 nova_compute[225701]: 2026-01-23 10:28:13.246 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:28:13 compute-2 nova_compute[225701]: 2026-01-23 10:28:13.269 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:28:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:13.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:14 compute-2 nova_compute[225701]: 2026-01-23 10:28:14.129 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:14 compute-2 ceph-mon[75771]: pgmap v1051: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 55 KiB/s wr, 85 op/s
Jan 23 10:28:14 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/676391413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:14 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/615719513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:15.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:15.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:16 compute-2 nova_compute[225701]: 2026-01-23 10:28:16.910 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3161038119' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:17 compute-2 ceph-mon[75771]: pgmap v1052: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:28:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/494566907' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:17.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:17.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:28:18 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3824772010' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:18 compute-2 nova_compute[225701]: 2026-01-23 10:28:18.877 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 5.607s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:28:18 compute-2 nova_compute[225701]: 2026-01-23 10:28:18.885 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:28:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:18 compute-2 nova_compute[225701]: 2026-01-23 10:28:18.907 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:28:18 compute-2 nova_compute[225701]: 2026-01-23 10:28:18.940 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:28:18 compute-2 nova_compute[225701]: 2026-01-23 10:28:18.941 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:28:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/563239400' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:18 compute-2 ceph-mon[75771]: pgmap v1053: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:28:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:19 compute-2 nova_compute[225701]: 2026-01-23 10:28:19.170 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:19.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:19.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:20 compute-2 ceph-mon[75771]: pgmap v1054: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 426 B/s wr, 73 op/s
Jan 23 10:28:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3824772010' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:20 compute-2 nova_compute[225701]: 2026-01-23 10:28:20.941 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:20 compute-2 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:20 compute-2 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:20 compute-2 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:20 compute-2 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:20 compute-2 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:28:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:21.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:21.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:21 compute-2 nova_compute[225701]: 2026-01-23 10:28:21.913 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:22 compute-2 ceph-mon[75771]: pgmap v1055: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 69 op/s
Jan 23 10:28:22 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:28:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:23.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:23.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:24 compute-2 ceph-mon[75771]: pgmap v1056: 353 pgs: 353 active+clean; 92 MiB data, 297 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 827 KiB/s wr, 83 op/s
Jan 23 10:28:24 compute-2 nova_compute[225701]: 2026-01-23 10:28:24.173 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:25.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:25.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:26 compute-2 ceph-mon[75771]: pgmap v1057: 353 pgs: 353 active+clean; 92 MiB data, 297 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 827 KiB/s wr, 14 op/s
Jan 23 10:28:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:26 compute-2 nova_compute[225701]: 2026-01-23 10:28:26.915 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:26 compute-2 sudo[238050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:28:26 compute-2 sudo[238050]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:26 compute-2 sudo[238050]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:27.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:27.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:27 compute-2 ceph-mon[75771]: pgmap v1058: 353 pgs: 353 active+clean; 95 MiB data, 297 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 1000 KiB/s wr, 15 op/s
Jan 23 10:28:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:28 compute-2 ceph-mon[75771]: pgmap v1059: 353 pgs: 353 active+clean; 103 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 1.6 MiB/s wr, 21 op/s
Jan 23 10:28:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:29 compute-2 nova_compute[225701]: 2026-01-23 10:28:29.221 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:29.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:29.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:30 compute-2 podman[238080]: 2026-01-23 10:28:30.627582459 +0000 UTC m=+0.052264125 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 23 10:28:30 compute-2 podman[238079]: 2026-01-23 10:28:30.651822638 +0000 UTC m=+0.076956416 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 10:28:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:31.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:31.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:31 compute-2 ceph-mon[75771]: pgmap v1060: 353 pgs: 353 active+clean; 103 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 1.6 MiB/s wr, 21 op/s
Jan 23 10:28:31 compute-2 sudo[238122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:28:31 compute-2 sudo[238122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:31 compute-2 sudo[238122]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:31 compute-2 sudo[238147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:28:31 compute-2 sudo[238147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:31 compute-2 nova_compute[225701]: 2026-01-23 10:28:31.917 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:32 compute-2 sudo[238147]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:32 compute-2 ceph-mon[75771]: pgmap v1061: 353 pgs: 353 active+clean; 113 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 186 KiB/s rd, 2.1 MiB/s wr, 47 op/s
Jan 23 10:28:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:33.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:33.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:33 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:28:33 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:28:33 compute-2 ceph-mon[75771]: pgmap v1062: 353 pgs: 353 active+clean; 113 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 178 KiB/s rd, 1.5 MiB/s wr, 37 op/s
Jan 23 10:28:33 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:28:33 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:28:33 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:28:33 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:28:33 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:28:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:34 compute-2 nova_compute[225701]: 2026-01-23 10:28:34.223 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:35.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:35.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:28:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:36 compute-2 ceph-mon[75771]: pgmap v1063: 353 pgs: 353 active+clean; 113 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 178 KiB/s rd, 1.5 MiB/s wr, 37 op/s
Jan 23 10:28:36 compute-2 nova_compute[225701]: 2026-01-23 10:28:36.919 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:37.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:37.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:37 compute-2 sudo[238209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:28:37 compute-2 sudo[238209]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:37 compute-2 sudo[238209]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:37 compute-2 ceph-mon[75771]: pgmap v1064: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 246 KiB/s rd, 1.3 MiB/s wr, 50 op/s
Jan 23 10:28:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:28:37 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:28:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:39 compute-2 nova_compute[225701]: 2026-01-23 10:28:39.225 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:39.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:39 compute-2 ceph-mon[75771]: pgmap v1065: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 244 KiB/s rd, 662 KiB/s wr, 43 op/s
Jan 23 10:28:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:39.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:41.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:41.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:41 compute-2 nova_compute[225701]: 2026-01-23 10:28:41.923 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:42 compute-2 ceph-mon[75771]: pgmap v1066: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 244 KiB/s rd, 661 KiB/s wr, 43 op/s
Jan 23 10:28:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:43.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:43.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:44 compute-2 ceph-mon[75771]: pgmap v1067: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 59 KiB/s wr, 15 op/s
Jan 23 10:28:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:44 compute-2 nova_compute[225701]: 2026-01-23 10:28:44.227 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:45.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:28:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:45.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:28:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:46 compute-2 ceph-mon[75771]: pgmap v1068: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 60 KiB/s rd, 52 KiB/s wr, 13 op/s
Jan 23 10:28:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:46 compute-2 nova_compute[225701]: 2026-01-23 10:28:46.925 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:47 compute-2 sudo[238244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:28:47 compute-2 sudo[238244]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:47 compute-2 sudo[238244]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3848999722' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:47.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:47.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:48 compute-2 ceph-mon[75771]: pgmap v1069: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 79 KiB/s rd, 53 KiB/s wr, 41 op/s
Jan 23 10:28:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 10:28:48 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/292376593' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:28:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 10:28:48 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/292376593' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:28:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:49 compute-2 nova_compute[225701]: 2026-01-23 10:28:49.229 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:49.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:49.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:49 compute-2 ceph-mon[75771]: pgmap v1070: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:28:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/292376593' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:28:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/292376593' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:28:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:28:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:50 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:28:50.976 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:28:50 compute-2 nova_compute[225701]: 2026-01-23 10:28:50.977 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:50 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:28:50.978 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:28:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:51.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:51 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:51.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:51 compute-2 ceph-mon[75771]: pgmap v1071: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:28:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:51 compute-2 nova_compute[225701]: 2026-01-23 10:28:51.928 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:52 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 23 10:28:52 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:52.971077) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:28:52 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 23 10:28:52 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132971399, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2385, "num_deletes": 251, "total_data_size": 6544207, "memory_usage": 6629936, "flush_reason": "Manual Compaction"}
Jan 23 10:28:52 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 23 10:28:52 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:28:52.980 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:28:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133284449, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4222889, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31244, "largest_seqno": 33624, "table_properties": {"data_size": 4213075, "index_size": 6244, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19896, "raw_average_key_size": 20, "raw_value_size": 4193741, "raw_average_value_size": 4305, "num_data_blocks": 264, "num_entries": 974, "num_filter_entries": 974, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163913, "oldest_key_time": 1769163913, "file_creation_time": 1769164132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 313358 microseconds, and 14499 cpu microseconds.
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:28:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:53.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.284538) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4222889 bytes OK
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.284574) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.308106) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.308170) EVENT_LOG_v1 {"time_micros": 1769164133308158, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.308206) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6533777, prev total WAL file size 6533777, number of live WAL files 2.
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.310275) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4123KB)], [60(12MB)]
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133310439, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16839391, "oldest_snapshot_seqno": -1}
Jan 23 10:28:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:53.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6335 keys, 14607145 bytes, temperature: kUnknown
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133588349, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14607145, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14564574, "index_size": 25629, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15877, "raw_key_size": 162325, "raw_average_key_size": 25, "raw_value_size": 14450187, "raw_average_value_size": 2281, "num_data_blocks": 1024, "num_entries": 6335, "num_filter_entries": 6335, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.588636) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14607145 bytes
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.591525) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 60.6 rd, 52.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.0 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.4) write-amplify(3.5) OK, records in: 6853, records dropped: 518 output_compression: NoCompression
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.591550) EVENT_LOG_v1 {"time_micros": 1769164133591540, "job": 36, "event": "compaction_finished", "compaction_time_micros": 278019, "compaction_time_cpu_micros": 42860, "output_level": 6, "num_output_files": 1, "total_output_size": 14607145, "num_input_records": 6853, "num_output_records": 6335, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133592439, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133594779, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.310150) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595028) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:53 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:53 compute-2 ceph-mon[75771]: pgmap v1072: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:28:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:54 compute-2 nova_compute[225701]: 2026-01-23 10:28:54.231 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:55.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:28:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:28:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:28:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:28:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:28:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:28:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:55.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:56 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:56 compute-2 nova_compute[225701]: 2026-01-23 10:28:56.930 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:57 compute-2 ceph-mon[75771]: pgmap v1073: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:28:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:57.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:57.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:58 compute-2 ceph-mon[75771]: pgmap v1074: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:28:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:28:59 compute-2 nova_compute[225701]: 2026-01-23 10:28:59.233 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:28:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:59.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:28:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:28:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:59.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:59 compute-2 ceph-mon[75771]: pgmap v1075: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:28:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:01.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:01.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:01 compute-2 podman[238284]: 2026-01-23 10:29:01.652272541 +0000 UTC m=+0.063476052 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:29:01 compute-2 podman[238283]: 2026-01-23 10:29:01.712340788 +0000 UTC m=+0.126484801 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 10:29:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:01 compute-2 nova_compute[225701]: 2026-01-23 10:29:01.963 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:02 compute-2 ceph-mon[75771]: pgmap v1076: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:03.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:03.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:03 compute-2 ceph-mon[75771]: pgmap v1077: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:04 compute-2 nova_compute[225701]: 2026-01-23 10:29:04.235 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:05.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:05.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:05 compute-2 nova_compute[225701]: 2026-01-23 10:29:05.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:06 compute-2 ceph-mon[75771]: pgmap v1078: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:29:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:06 compute-2 nova_compute[225701]: 2026-01-23 10:29:06.999 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:07 compute-2 sudo[238332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:29:07 compute-2 sudo[238332]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:07 compute-2 sudo[238332]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:07.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:07.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:07 compute-2 nova_compute[225701]: 2026-01-23 10:29:07.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:07 compute-2 nova_compute[225701]: 2026-01-23 10:29:07.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:29:07 compute-2 nova_compute[225701]: 2026-01-23 10:29:07.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:29:07 compute-2 nova_compute[225701]: 2026-01-23 10:29:07.798 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:29:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:08 compute-2 ceph-mon[75771]: pgmap v1079: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:09 compute-2 nova_compute[225701]: 2026-01-23 10:29:09.237 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:09.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:09.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:10 compute-2 ceph-mon[75771]: pgmap v1080: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:11.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:11.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:12 compute-2 nova_compute[225701]: 2026-01-23 10:29:12.002 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:12 compute-2 ceph-mon[75771]: pgmap v1081: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:12 compute-2 nova_compute[225701]: 2026-01-23 10:29:12.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:12 compute-2 nova_compute[225701]: 2026-01-23 10:29:12.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:12 compute-2 nova_compute[225701]: 2026-01-23 10:29:12.804 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:29:12 compute-2 nova_compute[225701]: 2026-01-23 10:29:12.804 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:29:12 compute-2 nova_compute[225701]: 2026-01-23 10:29:12.804 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:29:12 compute-2 nova_compute[225701]: 2026-01-23 10:29:12.804 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:29:12 compute-2 nova_compute[225701]: 2026-01-23 10:29:12.805 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:29:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:13.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:29:13 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3300512477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:13 compute-2 nova_compute[225701]: 2026-01-23 10:29:13.369 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.565s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:29:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:13.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:13 compute-2 nova_compute[225701]: 2026-01-23 10:29:13.557 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:29:13 compute-2 nova_compute[225701]: 2026-01-23 10:29:13.558 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4920MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:29:13 compute-2 nova_compute[225701]: 2026-01-23 10:29:13.559 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:29:13 compute-2 nova_compute[225701]: 2026-01-23 10:29:13.559 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:29:13 compute-2 ceph-mon[75771]: pgmap v1082: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2501464116' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3300512477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:13 compute-2 nova_compute[225701]: 2026-01-23 10:29:13.629 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:29:13 compute-2 nova_compute[225701]: 2026-01-23 10:29:13.629 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:29:13 compute-2 nova_compute[225701]: 2026-01-23 10:29:13.643 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:29:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:29:14 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/423661688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:14 compute-2 nova_compute[225701]: 2026-01-23 10:29:14.062 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:29:14 compute-2 nova_compute[225701]: 2026-01-23 10:29:14.067 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:29:14 compute-2 nova_compute[225701]: 2026-01-23 10:29:14.112 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:29:14 compute-2 nova_compute[225701]: 2026-01-23 10:29:14.114 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:29:14 compute-2 nova_compute[225701]: 2026-01-23 10:29:14.114 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:29:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:14 compute-2 nova_compute[225701]: 2026-01-23 10:29:14.239 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/423661688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2920842876' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3441369713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:15.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:15.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:16 compute-2 nova_compute[225701]: 2026-01-23 10:29:16.114 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:16 compute-2 nova_compute[225701]: 2026-01-23 10:29:16.133 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:16 compute-2 nova_compute[225701]: 2026-01-23 10:29:16.133 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:16 compute-2 nova_compute[225701]: 2026-01-23 10:29:16.133 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:16 compute-2 nova_compute[225701]: 2026-01-23 10:29:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:16 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:17 compute-2 nova_compute[225701]: 2026-01-23 10:29:17.047 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:17.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:17.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:17 compute-2 ceph-mon[75771]: pgmap v1083: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1986486844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:18 compute-2 ceph-mon[75771]: pgmap v1084: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:18 compute-2 nova_compute[225701]: 2026-01-23 10:29:18.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:18 compute-2 nova_compute[225701]: 2026-01-23 10:29:18.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:29:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:19 compute-2 nova_compute[225701]: 2026-01-23 10:29:19.274 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:19.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:29:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:29:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:20 compute-2 ceph-mon[75771]: pgmap v1085: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:21.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:21.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:22 compute-2 nova_compute[225701]: 2026-01-23 10:29:22.049 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:22 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:29:22 compute-2 ceph-mon[75771]: pgmap v1086: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:23.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:23.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:23 compute-2 ceph-mon[75771]: pgmap v1087: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:24 compute-2 nova_compute[225701]: 2026-01-23 10:29:24.277 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:25.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:25.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:26 compute-2 ceph-mon[75771]: pgmap v1088: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:27 compute-2 nova_compute[225701]: 2026-01-23 10:29:27.051 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:27 compute-2 sudo[238421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:29:27 compute-2 sudo[238421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:27 compute-2 sudo[238421]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:27.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:27.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:28 compute-2 ceph-mon[75771]: pgmap v1089: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:29 compute-2 nova_compute[225701]: 2026-01-23 10:29:29.279 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:29.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:29 compute-2 ceph-mon[75771]: pgmap v1090: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:29.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:31.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:31.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:31 compute-2 ceph-mon[75771]: pgmap v1091: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:32 compute-2 nova_compute[225701]: 2026-01-23 10:29:32.055 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:32 compute-2 podman[238453]: 2026-01-23 10:29:32.624503066 +0000 UTC m=+0.049549057 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:29:32 compute-2 podman[238452]: 2026-01-23 10:29:32.663019069 +0000 UTC m=+0.089378262 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:29:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:33.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:33.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:33 compute-2 ceph-mon[75771]: pgmap v1092: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:34 compute-2 nova_compute[225701]: 2026-01-23 10:29:34.282 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:35.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:35.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:35 compute-2 ceph-mon[75771]: pgmap v1093: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:29:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:37 compute-2 nova_compute[225701]: 2026-01-23 10:29:37.057 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:37.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:37.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:37 compute-2 sudo[238502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:29:37 compute-2 sudo[238502]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:37 compute-2 sudo[238502]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:37 compute-2 sudo[238527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:29:37 compute-2 sudo[238527]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:38 compute-2 ceph-mon[75771]: pgmap v1094: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:38 compute-2 sudo[238527]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:39 compute-2 nova_compute[225701]: 2026-01-23 10:29:39.283 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:39 compute-2 ceph-mon[75771]: pgmap v1095: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:29:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:29:39 compute-2 ceph-mon[75771]: pgmap v1096: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:29:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:29:39 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:29:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:39.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:39.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:41.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:41.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:42 compute-2 nova_compute[225701]: 2026-01-23 10:29:42.060 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:42 compute-2 ceph-mon[75771]: pgmap v1097: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 875 B/s rd, 0 op/s
Jan 23 10:29:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:43.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:43.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:44 compute-2 ceph-mon[75771]: pgmap v1098: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:44 compute-2 nova_compute[225701]: 2026-01-23 10:29:44.285 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:44 compute-2 sudo[238591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:29:44 compute-2 sudo[238591]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:44 compute-2 sudo[238591]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:45.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:45 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:45 compute-2 ceph-mon[75771]: pgmap v1099: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:45.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:47 compute-2 nova_compute[225701]: 2026-01-23 10:29:47.062 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:47 compute-2 sudo[238618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:29:47 compute-2 sudo[238618]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:47 compute-2 sudo[238618]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:47.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:47 compute-2 ceph-mon[75771]: pgmap v1100: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:47.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2022542434' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:29:48 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2022542434' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:29:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:49 compute-2 nova_compute[225701]: 2026-01-23 10:29:49.288 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:49.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:49.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:49 compute-2 ceph-mon[75771]: pgmap v1101: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:29:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:51.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:51.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:51 compute-2 ceph-mon[75771]: pgmap v1102: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:52 compute-2 nova_compute[225701]: 2026-01-23 10:29:52.065 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:53.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:53.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:54 compute-2 ceph-mon[75771]: pgmap v1103: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:54 compute-2 nova_compute[225701]: 2026-01-23 10:29:54.288 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:55.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:29:55.498 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:29:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:29:55.499 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:29:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:29:55.499 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:29:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:55.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:56 compute-2 ceph-mon[75771]: pgmap v1104: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:56 compute-2 sshd-session[238653]: Accepted publickey for zuul from 192.168.122.10 port 58248 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:29:56 compute-2 systemd-logind[786]: New session 55 of user zuul.
Jan 23 10:29:56 compute-2 systemd[1]: Started Session 55 of User zuul.
Jan 23 10:29:56 compute-2 sshd-session[238653]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:29:57 compute-2 nova_compute[225701]: 2026-01-23 10:29:57.067 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:57 compute-2 sudo[238657]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 23 10:29:57 compute-2 sudo[238657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:29:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:57.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:57.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:58 compute-2 ceph-mon[75771]: pgmap v1105: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:29:59 compute-2 nova_compute[225701]: 2026-01-23 10:29:59.289 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:59.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:29:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:29:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:59.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:29:59 compute-2 ceph-mon[75771]: pgmap v1106: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:00 compute-2 ceph-mon[75771]: from='client.25769 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:00 compute-2 ceph-mon[75771]: from='client.16287 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:00 compute-2 ceph-mon[75771]: Health detail: HEALTH_WARN 2 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Jan 23 10:30:00 compute-2 ceph-mon[75771]: [WRN] BLUESTORE_SLOW_OP_ALERT: 2 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:30:00 compute-2 ceph-mon[75771]:      osd.1 observed slow operation indications in BlueStore
Jan 23 10:30:00 compute-2 ceph-mon[75771]:      osd.2 observed slow operation indications in BlueStore
Jan 23 10:30:00 compute-2 ceph-mon[75771]: [WRN] CEPHADM_FAILED_DAEMON: 2 failed cephadm daemon(s)
Jan 23 10:30:00 compute-2 ceph-mon[75771]:     daemon nfs.cephfs.2.0.compute-0.fenqiu on compute-0 is in error state
Jan 23 10:30:00 compute-2 ceph-mon[75771]:     daemon nfs.cephfs.1.0.compute-2.tykohi on compute-2 is in error state
Jan 23 10:30:00 compute-2 ceph-mon[75771]: from='client.25807 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:01.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 10:30:01 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2989331378' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:01.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:01 compute-2 ceph-mon[75771]: from='client.25781 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:01 compute-2 ceph-mon[75771]: from='client.16296 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:01 compute-2 ceph-mon[75771]: from='client.25819 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2737233307' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1970715676' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:01 compute-2 ceph-mon[75771]: pgmap v1107: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2989331378' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:02 compute-2 nova_compute[225701]: 2026-01-23 10:30:02.070 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:03.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:03.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:03 compute-2 podman[238981]: 2026-01-23 10:30:03.628635623 +0000 UTC m=+0.053841355 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 10:30:03 compute-2 ceph-mon[75771]: pgmap v1108: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:03 compute-2 podman[238980]: 2026-01-23 10:30:03.65846024 +0000 UTC m=+0.083711283 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:30:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:04 compute-2 nova_compute[225701]: 2026-01-23 10:30:04.291 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:05.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:30:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:05.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:30:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:30:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:06 compute-2 ovs-vsctl[239080]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 10:30:06 compute-2 nova_compute[225701]: 2026-01-23 10:30:06.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:07 compute-2 ceph-mon[75771]: pgmap v1109: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:07 compute-2 nova_compute[225701]: 2026-01-23 10:30:07.073 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:07.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:07 compute-2 sudo[239119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:30:07 compute-2 sudo[239119]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:07 compute-2 sudo[239119]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:07.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:08 compute-2 ceph-mon[75771]: pgmap v1110: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:08 compute-2 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 10:30:08 compute-2 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 10:30:08 compute-2 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 10:30:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:08 compute-2 nova_compute[225701]: 2026-01-23 10:30:08.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:08 compute-2 nova_compute[225701]: 2026-01-23 10:30:08.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:30:08 compute-2 nova_compute[225701]: 2026-01-23 10:30:08.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:30:08 compute-2 nova_compute[225701]: 2026-01-23 10:30:08.852 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:30:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:08 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: cache status {prefix=cache status} (starting...)
Jan 23 10:30:09 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: client ls {prefix=client ls} (starting...)
Jan 23 10:30:09 compute-2 lvm[239453]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 10:30:09 compute-2 lvm[239453]: VG ceph_vg0 finished
Jan 23 10:30:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2265376060' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:09 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3709761161' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:09 compute-2 ceph-mon[75771]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:09 compute-2 nova_compute[225701]: 2026-01-23 10:30:09.294 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:09.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:09.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:09 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 10:30:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:09 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 10:30:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 23 10:30:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1500381436' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:10 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 10:30:10 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 10:30:10 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 10:30:10 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 10:30:10 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 10:30:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 23 10:30:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/824697158' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:10 compute-2 ceph-mon[75771]: from='client.16308 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:10 compute-2 ceph-mon[75771]: from='client.25793 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:10 compute-2 ceph-mon[75771]: from='client.16320 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:10 compute-2 ceph-mon[75771]: from='client.25808 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:10 compute-2 ceph-mon[75771]: pgmap v1111: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/858010683' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2096205967' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3912976921' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:30:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2048205722' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:30:11 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 10:30:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:11 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: ops {prefix=ops} (starting...)
Jan 23 10:30:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:11.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 23 10:30:11 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3358390859' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:30:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:11.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 23 10:30:11 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2288603447' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:30:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 10:30:11 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2183856151' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:12 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: session ls {prefix=session ls} (starting...)
Jan 23 10:30:12 compute-2 nova_compute[225701]: 2026-01-23 10:30:12.109 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:12 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: status {prefix=status} (starting...)
Jan 23 10:30:12 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 23 10:30:12 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2631441634' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.16332 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.25823 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.25843 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.16347 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.25841 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.25855 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1500381436' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2632100089' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1513163500' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1236970046' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/604163341' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.25867 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.25862 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.16383 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/824697158' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1085471352' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.25882 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.25883 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: pgmap v1112: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/447049587' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3358390859' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/661403047' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/621193792' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2288603447' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3873136511' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2183856151' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:13.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 10:30:13 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2510791750' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:13.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.16395 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2326913399' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1159080727' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2427895171' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2342214168' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2631441634' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.25937 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/728729314' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3971325804' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/591404194' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: pgmap v1113: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1820403701' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1489607522' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1309114209' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2510791750' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:13 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2372642057' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:30:13 compute-2 nova_compute[225701]: 2026-01-23 10:30:13.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 10:30:13 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1794080162' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:14 compute-2 nova_compute[225701]: 2026-01-23 10:30:14.344 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 23 10:30:14 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3596812779' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 23 10:30:14 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1267146393' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:30:14 compute-2 nova_compute[225701]: 2026-01-23 10:30:14.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:14 compute-2 nova_compute[225701]: 2026-01-23 10:30:14.918 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:30:14 compute-2 nova_compute[225701]: 2026-01-23 10:30:14.919 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:30:14 compute-2 nova_compute[225701]: 2026-01-23 10:30:14.919 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:30:14 compute-2 nova_compute[225701]: 2026-01-23 10:30:14.919 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:30:14 compute-2 nova_compute[225701]: 2026-01-23 10:30:14.919 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.25927 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.16452 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/261526027' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2049800932' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.25936 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/600541726' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1794080162' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4156927190' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3699757616' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4136571834' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2345853878' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3596812779' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1267146393' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 10:30:15 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3550185203' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:15.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 10:30:15 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/834363629' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:15 compute-2 nova_compute[225701]: 2026-01-23 10:30:15.536 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:30:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:15.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:15 compute-2 nova_compute[225701]: 2026-01-23 10:30:15.725 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:30:15 compute-2 nova_compute[225701]: 2026-01-23 10:30:15.727 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4649MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:30:15 compute-2 nova_compute[225701]: 2026-01-23 10:30:15.727 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:30:15 compute-2 nova_compute[225701]: 2026-01-23 10:30:15.727 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:30:15 compute-2 nova_compute[225701]: 2026-01-23 10:30:15.856 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:30:15 compute-2 nova_compute[225701]: 2026-01-23 10:30:15.857 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:30:15 compute-2 nova_compute[225701]: 2026-01-23 10:30:15.879 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:30:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.25985 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.16494 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.26003 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3571602394' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4293863724' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: pgmap v1114: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3550185203' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.26027 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2261349115' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2952901814' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/834363629' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4201206189' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/644955657' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1559929094' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:35.143447+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 466944 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:36.143647+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 458752 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:37.143836+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 458752 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:38.144021+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 450560 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 820182 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:39.144178+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 450560 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:40.144402+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 450560 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:41.144564+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:42.144813+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:43.145015+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 820182 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:44.145235+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:45.145369+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 401408 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:46.145538+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 385024 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:47.145766+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 368640 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:48.146017+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 368640 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 71.207946777s of 71.231231689s, submitted: 2
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 821694 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:49.146266+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 368640 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:50.146505+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 385024 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:51.146831+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 385024 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226affc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:52.146986+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 352256 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:53.147205+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 352256 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:54.147459+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 352256 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:55.147629+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70926336 unmapped: 335872 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:56.147889+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70934528 unmapped: 327680 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:57.148093+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70950912 unmapped: 311296 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:58.148255+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70950912 unmapped: 311296 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:59.148417+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70950912 unmapped: 311296 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:00.148650+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 278528 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:01.148799+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 278528 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:02.149050+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 262144 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:03.149279+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 262144 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:04.149505+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 262144 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:05.149751+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 237568 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:06.150019+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 237568 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:07.150166+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 221184 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:08.150342+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 221184 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:09.150517+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 221184 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:10.150690+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 212992 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:11.150938+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 212992 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:12.151181+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 204800 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:13.151417+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 196608 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:14.151619+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 196608 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:15.151791+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:16.152070+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:17.152267+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 172032 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:18.152519+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:19.152682+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:20.152934+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:21.153214+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 155648 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:22.153657+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 147456 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:23.153841+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 139264 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:24.154069+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 139264 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:25.154251+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 122880 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:26.154482+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 114688 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:27.154797+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 114688 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:28.155035+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 106496 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:29.155259+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 106496 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:30.155401+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 106496 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:31.155591+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 98304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:32.155799+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 98304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:33.156020+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 98304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:34.156238+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 90112 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:35.156470+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 81920 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:36.156664+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 47.972595215s of 47.983440399s, submitted: 2
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:37.156892+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 65536 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:38.157118+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 40960 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:39.157291+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 32768 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 826230 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:40.157534+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 32768 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:41.157782+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 16384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:42.157956+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 16384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:43.158122+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 16384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:44.158303+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 16384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:45.158520+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 8192 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:46.158791+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 8192 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:47.158996+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 8192 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:48.159158+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 0 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:49.159377+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 0 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:50.159567+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:51.159812+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:52.160028+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:53.160190+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:54.160409+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:55.160708+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:56.160998+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:57.161165+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:58.161374+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:59.161524+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 1007616 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:00.161710+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 1007616 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:01.161917+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:02.162082+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:03.162229+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:04.162386+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:05.162597+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 974848 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:06.162892+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 974848 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:07.163050+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:08.163276+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 958464 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:09.163445+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:10.163603+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:11.163809+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:12.164003+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:13.164140+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:14.164286+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:15.164432+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 925696 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:16.164695+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 925696 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:17.164860+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:18.165047+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:19.165398+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:20.165696+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 884736 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226affc00 session 0x55922546ef00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:21.165967+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 884736 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afe400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 43.980842590s of 44.448013306s, submitted: 4
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:22.166149+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:23.166367+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:24.166573+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825969 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:25.166802+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:26.167077+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 860160 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:27.167257+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:28.167466+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:29.167669+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825969 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:30.167862+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:31.168071+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 843776 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:32.168276+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 843776 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:33.168487+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 835584 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:34.168702+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 835584 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825969 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:35.168934+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 835584 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.276217461s of 14.299237251s, submitted: 2
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:36.169153+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:37.169364+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:38.169603+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b07400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:39.169839+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828993 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:40.170059+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:41.170274+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:42.170479+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 811008 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:43.170695+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 811008 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:44.171289+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:45.171467+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:46.171784+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:47.172018+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:48.172238+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:49.172454+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 778240 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:50.172631+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 778240 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:51.172842+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 778240 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:52.173049+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:53.173256+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 761856 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:54.173426+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:55.173629+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:56.173838+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:57.173987+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:58.174151+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:59.174313+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 720896 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:00.174463+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 720896 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:01.174653+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 720896 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:02.174819+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:03.175198+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:04.175358+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:05.175486+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:06.175744+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:07.175892+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:08.176058+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:09.176239+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:10.176397+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:11.176592+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 671744 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:12.176772+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:13.176924+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:14.177093+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:15.177237+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 655360 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:16.177434+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 655360 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:17.177595+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread fragmentation_score=0.000021 took=0.000131s
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:18.177745+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:19.177897+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:20.178115+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71671808 unmapped: 638976 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:21.178221+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:22.178390+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:23.178507+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 622592 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:24.178658+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 622592 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:25.178813+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 614400 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:26.179001+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 614400 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:27.179169+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 614400 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:28.179313+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:29.179558+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:30.179704+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:31.179873+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 589824 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:32.180066+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 589824 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:33.180198+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:34.180361+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:35.180523+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:36.180760+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 565248 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:37.181124+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 565248 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:38.181390+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 548864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:39.181583+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 548864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:40.181740+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 548864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:41.181954+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:42.182142+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:43.182440+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:44.182714+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:45.182953+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:46.183126+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:47.183324+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:48.183514+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:49.183693+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226b07400 session 0x55922657e1e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:50.183838+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:51.184008+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:52.184185+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:53.184430+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:54.184632+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 491520 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:55.184809+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 491520 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:56.185013+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 483328 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:57.185207+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71835648 unmapped: 475136 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:58.185414+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71835648 unmapped: 475136 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:59.185587+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 466944 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:00.185750+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 466944 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:01.185893+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71852032 unmapped: 458752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:02.186073+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71852032 unmapped: 458752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:03.186285+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71852032 unmapped: 458752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:04.186547+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71860224 unmapped: 450560 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:05.186691+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 442368 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:06.186912+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 434176 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:07.187092+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af5c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 425984 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:08.187249+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 425984 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:09.187402+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 425984 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:10.187540+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 417792 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:11.187688+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 417792 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 95.813194275s of 96.227020264s, submitted: 3
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:12.187850+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 417792 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:13.188057+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:14.188274+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:15.188436+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 401408 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:16.188659+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71917568 unmapped: 393216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:17.188833+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71917568 unmapped: 393216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:18.189057+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:19.189297+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:20.189441+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:21.189596+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71933952 unmapped: 376832 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:22.189760+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71950336 unmapped: 360448 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:23.189917+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 352256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:24.190076+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 352256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:25.190246+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 352256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:26.190459+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 344064 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:27.190614+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226afe400 session 0x55922677f0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 344064 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:28.190840+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 344064 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:29.191240+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71974912 unmapped: 335872 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:30.191430+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71974912 unmapped: 335872 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:31.191639+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71974912 unmapped: 335872 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 5367 writes, 23K keys, 5367 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5367 writes, 783 syncs, 6.85 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5367 writes, 23K keys, 5367 commit groups, 1.0 writes per commit group, ingest: 18.76 MB, 0.03 MB/s
                                           Interval WAL: 5367 writes, 783 syncs, 6.85 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:32.191835+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 270336 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:33.192011+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 262144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:34.192433+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 253952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:35.192641+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 253952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:36.192924+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 253952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:37.193073+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 245760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:38.193211+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 245760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:39.193974+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 229376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:40.194125+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 229376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:41.194245+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 229376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:42.194456+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 221184 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:43.194630+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 221184 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:44.194812+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b07000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.414127350s of 32.532848358s, submitted: 2
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 212992 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:45.194954+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 212992 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:46.195186+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 204800 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:47.195367+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72114176 unmapped: 196608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:48.195523+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72114176 unmapped: 196608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:49.195648+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72114176 unmapped: 196608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:50.195807+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 188416 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:51.195943+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 188416 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:52.196086+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 180224 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:53.196233+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 172032 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:54.196457+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 172032 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:55.196669+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 163840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:56.197029+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 163840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:57.197190+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 163840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:58.197381+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 155648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:59.197621+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 155648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:00.197795+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:01.198110+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:02.198281+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:03.198463+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226af5c00 session 0x55922546e960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72171520 unmapped: 139264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.672498703s of 19.676660538s, submitted: 1
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:04.198593+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72220672 unmapped: 1138688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:05.198781+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72384512 unmapped: 974848 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:06.199005+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72392704 unmapped: 966656 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:07.199194+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 761856 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:08.199324+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,3])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72622080 unmapped: 737280 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:09.199491+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72622080 unmapped: 737280 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:10.267285+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828948 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,3])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 720896 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:11.267421+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:12.269582+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:13.269779+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:14.269929+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 5.810975552s of 11.037171364s, submitted: 201
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:15.270150+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 696320 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:16.270612+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 696320 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:17.270796+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 679936 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:18.270939+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 679936 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:19.271136+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72687616 unmapped: 671744 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:20.271277+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b00800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 663552 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:21.271443+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 663552 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:22.271573+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72704000 unmapped: 655360 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:23.271834+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72704000 unmapped: 655360 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:24.271977+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 647168 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:25.272164+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 630784 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:26.272379+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 630784 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:27.272553+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72744960 unmapped: 614400 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:28.272776+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72744960 unmapped: 614400 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:29.272954+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72744960 unmapped: 614400 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:30.273085+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:31.273282+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 606208 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:32.273471+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 606208 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:33.273644+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 598016 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:34.273825+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 598016 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:35.273976+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72769536 unmapped: 589824 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:36.274171+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72769536 unmapped: 589824 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:37.274353+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 581632 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:38.274820+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 581632 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:39.275049+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 581632 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:40.275198+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 565248 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:41.275432+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 557056 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:42.275641+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72810496 unmapped: 548864 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:43.275890+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72810496 unmapped: 548864 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:44.276169+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72810496 unmapped: 548864 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:45.276379+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 540672 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:46.276608+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 540672 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:47.276817+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 540672 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:48.276976+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 532480 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:49.277161+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72835072 unmapped: 524288 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:50.277398+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72851456 unmapped: 507904 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:51.277634+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72851456 unmapped: 507904 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:52.277825+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72859648 unmapped: 499712 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:53.278005+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72859648 unmapped: 499712 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:54.278295+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72867840 unmapped: 491520 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:55.278467+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 475136 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:56.278665+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 466944 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:57.278799+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 466944 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:58.278935+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 466944 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:59.279088+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 458752 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:00.279218+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 458752 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:01.279377+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 450560 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:02.279532+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 450560 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:03.279656+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 450560 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:04.279795+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72916992 unmapped: 442368 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:05.279914+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72916992 unmapped: 442368 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:06.280097+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72941568 unmapped: 417792 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:07.280252+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72941568 unmapped: 417792 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:08.280405+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72949760 unmapped: 409600 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:09.280560+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72949760 unmapped: 409600 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:10.280698+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72949760 unmapped: 409600 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:11.280815+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72957952 unmapped: 401408 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:12.281333+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72957952 unmapped: 401408 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:13.281481+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72974336 unmapped: 385024 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:14.281629+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72974336 unmapped: 385024 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:15.281805+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72982528 unmapped: 376832 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:16.282036+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72990720 unmapped: 368640 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:17.282229+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72990720 unmapped: 368640 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:18.282863+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72998912 unmapped: 360448 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226b08000 session 0x559224554f00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:19.283055+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72998912 unmapped: 360448 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:20.283233+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73015296 unmapped: 344064 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:21.283607+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73015296 unmapped: 344064 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:22.283902+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73015296 unmapped: 344064 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:23.284200+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73023488 unmapped: 335872 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:24.284373+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226b07000 session 0x559224ef0b40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73023488 unmapped: 335872 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:25.284532+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73031680 unmapped: 327680 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:26.284785+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73048064 unmapped: 311296 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:27.284945+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73048064 unmapped: 311296 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:28.285098+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73056256 unmapped: 303104 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:29.285283+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73056256 unmapped: 303104 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:30.285421+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73064448 unmapped: 294912 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:31.285575+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73064448 unmapped: 294912 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:32.285756+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73072640 unmapped: 286720 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 75.457023621s of 77.777244568s, submitted: 17
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:33.285908+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73072640 unmapped: 286720 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:34.286077+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73072640 unmapped: 286720 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:35.286245+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73072640 unmapped: 286720 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831165 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261edc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:36.286500+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73080832 unmapped: 278528 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:37.286653+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73080832 unmapped: 278528 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:38.286833+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73089024 unmapped: 270336 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:39.286992+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73089024 unmapped: 270336 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:40.287153+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73089024 unmapped: 270336 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 832086 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:41.287398+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d7000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 245760 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:42.287526+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 245760 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:43.287692+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 245760 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:44.287799+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 245760 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:45.288053+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 237568 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835110 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:46.288235+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:47.288428+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.212070465s of 15.231811523s, submitted: 5
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:48.288610+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:49.288821+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:50.288967+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:51.289212+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:52.289455+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:53.289703+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:54.290320+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:55.290459+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:56.290666+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:57.290782+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 212992 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:58.290924+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:59.291084+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:00.291231+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:01.291378+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:02.291525+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:03.291659+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:04.291796+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:05.291950+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:06.292165+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 188416 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:07.292296+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 188416 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:08.292476+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 188416 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:09.292769+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 188416 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:10.292887+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:11.293029+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:12.293221+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:13.293420+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:14.293558+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:15.293718+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:16.293945+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:17.294105+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:18.294280+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:19.294427+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:20.294581+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:21.294883+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:22.295113+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:23.295319+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:24.295466+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:25.295617+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:26.295821+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:27.296026+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:28.296159+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:29.296309+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:30.296472+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:31.296642+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:32.296876+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:33.297058+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:34.297248+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:35.297364+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:36.297557+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:37.297747+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:38.297886+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:39.298027+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:40.298158+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:41.298288+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 163840 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:42.298428+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 163840 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261edc00 session 0x559226f241e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:43.298574+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 163840 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:44.298737+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 163840 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:45.298876+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:46.299041+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:47.299172+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:48.299334+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:49.299501+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:50.299666+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:51.299887+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:52.300040+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:53.300280+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:54.300466+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:55.300625+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:56.300890+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:57.301048+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:58.301196+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:59.301335+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226636c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 71.693893433s of 71.732337952s, submitted: 2
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:00.301457+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835440 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:01.301618+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:02.301752+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:03.301907+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:04.302027+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:05.302163+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:06.302357+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:07.302471+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:08.302598+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:09.302750+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:10.302879+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:11.303011+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:12.303159+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:13.303339+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:14.303526+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:15.303643+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:16.303779+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:17.303949+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:18.304092+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:19.304182+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:20.304318+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:21.304476+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:22.304870+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:23.305018+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:24.305159+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:25.305316+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:26.305487+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:27.305619+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226b00800 session 0x559224742000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:28.305770+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:29.305925+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:30.306072+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:31.306212+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:32.306402+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:33.306573+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:34.306737+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:35.306940+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:36.307134+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:37.307301+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:38.307524+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:39.307751+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:40.307883+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:41.308031+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 40.526988983s of 41.823482513s, submitted: 3
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:42.308186+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:43.308348+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:44.308510+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:45.308896+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835770 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:46.309172+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:47.309486+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 98304 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:48.309812+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 98304 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:49.310345+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 98304 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:50.310509+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:51.310684+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:52.310787+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:53.311080+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73285632 unmapped: 73728 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:54.311292+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73285632 unmapped: 73728 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:55.311532+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:56.311759+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:57.311889+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:58.312034+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:59.312154+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:00.312291+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:01.312420+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:02.312558+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:03.312715+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:04.312895+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:05.313055+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:06.313215+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:07.313363+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:08.313559+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:09.313714+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:10.313913+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:11.314051+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:12.314186+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:13.314323+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:14.314455+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592247d7000 session 0x55922657e3c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:15.314591+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:16.314897+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73334784 unmapped: 24576 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:17.315099+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226636c00 session 0x5592267a7860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:18.315309+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:19.315486+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:20.315692+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:21.315874+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:22.316078+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:23.316319+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:24.316528+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:25.316846+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:26.319093+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:27.319405+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:28.319549+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:29.319693+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:30.319815+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:31.319956+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af8000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:32.320088+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:33.320238+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:34.320378+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:35.320500+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:36.320706+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:37.320943+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:38.321076+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 56.421627045s of 56.530124664s, submitted: 2
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:39.321322+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:40.321478+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:41.321661+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:42.321787+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:43.321948+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:44.322109+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:45.322253+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:46.322431+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:47.322586+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:48.322760+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:49.322922+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:50.323065+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 98304 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:51.323217+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:52.323583+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:53.323747+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:54.323898+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:55.324038+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:56.324221+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:57.324351+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:58.324491+0000)
Jan 23 10:30:16 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:59.324615+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:00.324785+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:01.324936+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559224eea800 session 0x559224743c20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549f800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:02.325081+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:03.325299+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:04.325549+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:05.325719+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:06.325950+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:07.326090+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:08.326237+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:09.326432+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:10.326603+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:11.326801+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73285632 unmapped: 73728 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:12.326950+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:13.327097+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:14.327248+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:15.327427+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:16.327650+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:17.327814+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:18.327941+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:19.328066+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:20.328205+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226af8000 session 0x559226feeb40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:21.328350+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:22.328503+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:23.328637+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:24.328778+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:25.328942+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:26.329117+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:27.329310+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:28.329503+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:29.329644+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:30.329782+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:31.329912+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:32.330044+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:33.330172+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:34.330323+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:35.330457+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261f1400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 57.254508972s of 57.258758545s, submitted: 1
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:36.330657+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:37.330818+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:38.331023+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:39.331190+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:40.331396+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:41.331595+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:42.331747+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:43.331891+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:44.332029+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:45.332252+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:46.332457+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:47.332647+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:48.332995+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:49.333829+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:50.333990+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:51.334141+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:52.334320+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:53.334461+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:54.334664+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:55.335346+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:56.335622+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:57.335878+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:58.336095+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:59.336355+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:00.336607+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:01.336775+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:02.336929+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:03.337095+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:04.337301+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:05.337525+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:06.337795+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:07.337933+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:08.338102+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:09.338244+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:10.338399+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:11.338529+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:12.338690+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:13.338807+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:14.338951+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:15.339095+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:16.339328+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:17.339467+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:18.339788+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:19.339954+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:20.340127+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:21.340363+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:22.340536+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:23.340684+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:24.340819+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:25.340934+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:26.341137+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:27.341320+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:28.341454+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:29.341691+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:30.341873+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:31.342128+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:32.342346+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:33.342489+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:34.342655+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:35.342818+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:36.343046+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:37.343200+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:38.343380+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ec800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 63.238079071s of 63.242374420s, submitted: 1
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:39.343524+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:40.343659+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:41.343859+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837612 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:42.344035+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:43.344230+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:44.344391+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:45.344601+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:46.344870+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:47.345007+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:48.345210+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:49.345450+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:50.345621+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:51.345780+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:52.345935+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:53.346121+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:54.346270+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:55.346430+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:56.346627+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:57.346788+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:58.346941+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:59.347061+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:00.347209+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:01.347333+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:02.347516+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:03.347764+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:04.347900+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:05.348048+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:06.348232+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:07.348433+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:08.348572+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:09.348807+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:10.349000+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:11.349141+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:12.349298+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:13.349521+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:14.349663+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:15.349832+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:16.350076+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:17.350286+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:18.350465+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:19.350640+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:20.350849+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:21.351055+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:22.351205+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:23.351386+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:24.351641+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:25.351853+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:26.352044+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:27.352187+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:28.352527+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:29.352793+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:30.352946+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:31.353122+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:32.353300+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:33.353474+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:34.353609+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:35.353767+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:36.354039+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:37.354184+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:38.354329+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:39.354814+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:40.354989+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:41.355123+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:42.355308+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:43.355443+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:44.355596+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:45.355743+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:46.355949+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:47.356117+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:48.356430+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:49.356559+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:50.356707+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:51.356894+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:52.357120+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:53.357313+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:54.357465+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:55.357593+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:56.357817+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:57.357981+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:58.358247+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:59.358379+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:00.358614+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:01.358788+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:02.359022+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:03.359252+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:04.359463+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:05.359719+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:06.360081+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:07.360340+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:08.360546+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:09.360785+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:10.360926+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261ec800 session 0x559226fee960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:11.361133+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:12.361357+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:13.361548+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:14.361756+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:15.361913+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:16.362086+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:17.362242+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:18.362384+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:19.362582+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:20.362747+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:21.362875+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:22.363052+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:23.363255+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:24.363419+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:25.363591+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:26.363853+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:27.364031+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:28.364379+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:29.364578+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:30.364811+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 110.972518921s of 111.781974792s, submitted: 2
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:31.364967+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:32.365113+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:33.365300+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:34.365472+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:35.365634+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:36.365785+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:37.365990+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:38.366159+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:39.366280+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:40.366404+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:41.366522+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:42.366669+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:43.366767+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:44.367174+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:45.367310+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:46.367528+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:47.367678+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:48.367867+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:49.368014+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:50.368174+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:51.368301+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:52.368457+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:53.368851+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:54.369037+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:55.369268+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:56.369496+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:57.369797+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:58.369986+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:59.370142+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:00.370306+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:01.370587+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:02.370807+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:03.370964+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:04.371112+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:05.371255+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:06.371526+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:07.371699+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:08.371888+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:09.372053+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:10.372206+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:11.372411+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:12.372567+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:13.372696+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:14.372875+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:15.373021+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:16.373194+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:17.373337+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:18.373535+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:19.373858+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:20.374075+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:21.374236+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:22.374390+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:23.374539+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:24.374777+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:25.374911+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:26.375062+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:27.375205+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:28.375351+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:29.375498+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:30.375650+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:31.375785+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 5807 writes, 24K keys, 5807 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5807 writes, 987 syncs, 5.88 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 440 writes, 717 keys, 440 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
                                           Interval WAL: 440 writes, 204 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:32.375984+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:33.376157+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:34.376349+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:35.376598+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:36.376860+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:37.377032+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:38.377277+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:39.377524+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:40.377792+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:41.377975+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:42.378156+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:43.378348+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:44.378501+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:45.378646+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:46.378882+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:47.379104+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:48.379332+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:49.379527+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:50.379697+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:51.379929+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:52.380088+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:53.380260+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:54.380408+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261f1400 session 0x559226fefc20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:55.380538+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:56.380788+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:57.380990+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:58.381168+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:59.381396+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:00.381583+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:01.381805+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:02.382018+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:03.382172+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:04.382319+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 93.923355103s of 93.927070618s, submitted: 1
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:05.382442+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74350592 unmapped: 1105920 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:06.382696+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:07.382871+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:08.383061+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:09.383248+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b04c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:10.383512+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1097728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:11.383663+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1089536 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:12.383837+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 1073152 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838014 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:13.384047+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75448320 unmapped: 1056768 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226636c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:14.384220+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 1048576 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:15.384411+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 1048576 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.652290344s of 11.764292717s, submitted: 230
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:16.384590+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:17.384994+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840966 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:18.385259+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:19.385485+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:20.385649+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:21.385798+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:22.385942+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:23.386086+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:24.386326+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:25.386558+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:26.386904+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:27.387224+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:28.387541+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:29.387896+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:30.388181+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:31.388521+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:32.388798+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:33.388982+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:34.389162+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:35.389299+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:36.389488+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:37.389632+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:38.389797+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:39.389984+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:40.390219+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:41.390383+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:42.390515+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226afa000 session 0x5592254723c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:43.390686+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:44.390863+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:45.390990+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:46.391191+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:47.391387+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:48.391603+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:49.391785+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:50.391907+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:51.392099+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:52.392315+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:53.392508+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:54.392694+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:55.392773+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:56.393040+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:57.393212+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.259738922s of 41.266269684s, submitted: 3
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 841887 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:58.393392+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:59.393599+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:00.393799+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226728800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:01.393995+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:02.394211+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 841887 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:03.394384+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:04.394525+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:05.394648+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:06.394978+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:07.395208+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:08.395379+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:09.395582+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:10.395780+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:11.395953+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:12.396121+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:13.396297+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:14.396492+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:15.396609+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:16.396772+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:17.396930+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:18.397070+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:19.397190+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:20.397319+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:21.397471+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:22.397627+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:23.397789+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:24.397927+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:25.398053+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:26.398222+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:27.398389+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:28.398596+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:29.398826+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:30.399670+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:31.399869+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:32.400038+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:33.400203+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:34.400400+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:35.400538+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:36.400703+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:37.400917+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:38.401059+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:39.401270+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:40.401416+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:41.402515+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:42.403469+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:43.404163+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:44.405659+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:45.405855+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:46.406203+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:47.406661+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:48.407282+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:49.407510+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:50.407699+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:51.407884+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:52.408231+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:53.408676+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:54.408841+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:55.409013+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:56.409293+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:57.409529+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:58.409949+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:59.410211+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:00.410388+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:01.410654+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:02.410879+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:03.411228+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:04.411572+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:05.411791+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:06.412084+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:07.412259+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:08.412506+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:09.412775+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:10.412892+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:11.413049+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:12.413273+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:13.413476+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:14.413712+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:15.414012+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:16.414273+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:17.414435+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:18.414625+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:19.414837+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:20.414976+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:21.415179+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:22.415378+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:23.415586+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:24.415800+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:25.416036+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:26.416416+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:27.416596+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:28.416778+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:29.417101+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:30.417339+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:31.417559+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:32.417710+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:33.417859+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:34.418069+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:35.418234+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:36.418463+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:37.418613+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:38.418801+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:39.418959+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:40.419192+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:41.419440+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:42.419621+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:43.419826+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:44.419994+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:45.420148+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:46.420319+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:47.420449+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:48.420594+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:49.420795+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:50.421018+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1974272 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:51.421186+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:52.421355+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:53.421529+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:54.421674+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:55.421858+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:56.422124+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:57.422303+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:58.422449+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:59.422613+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:00.422789+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:01.422954+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:02.423121+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:03.423295+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:04.423485+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:05.423643+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:06.423804+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:07.423974+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af8800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 130.189498901s of 130.569747925s, submitted: 3
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _renew_subs
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75636736 unmapped: 1916928 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:08.424117+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846191 data_alloc: 218103808 data_used: 40960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _renew_subs
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fca7a000/0x0/0x4ffc00000, data 0xed7f2/0x1a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 76718080 unmapped: 835584 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:09.424270+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 16392192 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:10.424449+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _renew_subs
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 141 ms_handle_reset con 0x559226af8800 session 0x559227226780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fba75000/0x0/0x4ffc00000, data 0x10ef970/0x11a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 16359424 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eeb000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fba75000/0x0/0x4ffc00000, data 0x10ef970/0x11a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:11.424623+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 16236544 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:12.424796+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 142 ms_handle_reset con 0x559224eeb000 session 0x559227226d20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:13.424997+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967505 data_alloc: 218103808 data_used: 45056
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:14.425136+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:15.425336+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6d000/0x0/0x4ffc00000, data 0x10f3bc6/0x11ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:16.425553+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:17.425842+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:18.426486+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:19.426917+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:20.427312+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:21.427599+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:22.427787+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:23.428378+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:24.428906+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:25.429343+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:26.429828+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:27.430251+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226b04c00 session 0x559226feeb40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:28.430360+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:29.430512+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:30.430677+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:31.430796+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:32.430988+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:33.431361+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:34.431850+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:35.432102+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:36.432347+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:37.432689+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:38.433058+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226636c00 session 0x55922721e780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:39.433406+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:40.433653+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:41.433810+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:42.434150+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.924980164s of 34.470951080s, submitted: 51
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:43.434459+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971447 data_alloc: 218103808 data_used: 45056
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:44.434763+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226728800 session 0x55922677e960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:45.434866+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:46.435047+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:47.435246+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:48.435454+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971447 data_alloc: 218103808 data_used: 45056
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af8400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226af8400 session 0x559227227e00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e8400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x5592261e8400 session 0x55922723c5a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226634000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226634000 session 0x55922723c780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:49.435594+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 16171008 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:50.435789+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afcc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 16171008 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226afcc00 session 0x55922723c960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afcc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226afcc00 session 0x55922723cd20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:51.436177+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 92913664 unmapped: 1425408 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af6400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226af6400 session 0x55922723cf00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:52.436358+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 92938240 unmapped: 1400832 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.170117378s of 10.184672356s, submitted: 3
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fba67000/0x0/0x4ffc00000, data 0x10f7c84/0x11b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,7])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1000 session 0x55922723d0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:53.436587+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93839360 unmapped: 17342464 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123797 data_alloc: 234881024 data_used: 13676544
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1800 session 0x55922723da40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:54.436766+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93863936 unmapped: 17317888 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:55.436938+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93855744 unmapped: 17326080 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e7c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e7c00 session 0x559226f24f00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:56.437185+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93937664 unmapped: 17244160 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:57.437344+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93937664 unmapped: 17244160 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1000 session 0x559226f24960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fac33000/0x0/0x4ffc00000, data 0x1f29dc4/0x1fe7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1800 session 0x55922657f860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:58.437552+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93700096 unmapped: 17481728 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1128704 data_alloc: 234881024 data_used: 13676544
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af6400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afcc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:59.437817+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93724672 unmapped: 17457152 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:00.437947+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 99999744 unmapped: 11182080 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:01.438147+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105676800 unmapped: 5505024 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:02.438561+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105676800 unmapped: 5505024 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.482107162s of 10.092863083s, submitted: 62
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:03.438798+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1227418 data_alloc: 234881024 data_used: 25862144
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:04.439046+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:05.439224+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:06.439499+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:07.439699+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:08.439939+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225987 data_alloc: 234881024 data_used: 25862144
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:09.440121+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:10.440312+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:11.440488+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109395968 unmapped: 3883008 heap: 113278976 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa78e000/0x0/0x4ffc00000, data 0x23ceda5/0x248e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:12.440668+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108773376 unmapped: 9756672 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.997505188s of 10.222607613s, submitted: 78
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9eb8000/0x0/0x4ffc00000, data 0x2ca4da5/0x2d64000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:13.440848+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111665152 unmapped: 6864896 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1356813 data_alloc: 251658240 data_used: 27123712
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:14.441064+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 6561792 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:15.441286+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c6c000/0x0/0x4ffc00000, data 0x2d4fda5/0x2e0f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:16.441596+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:17.441783+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:18.442026+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1357269 data_alloc: 251658240 data_used: 27136000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:19.442173+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 6373376 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c49000/0x0/0x4ffc00000, data 0x2d73da5/0x2e33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:20.442397+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 6373376 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:21.442629+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112173056 unmapped: 6356992 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:22.442851+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112173056 unmapped: 6356992 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:23.443081+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c49000/0x0/0x4ffc00000, data 0x2d73da5/0x2e33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354717 data_alloc: 251658240 data_used: 27205632
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:24.443284+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:25.443498+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:26.443772+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.461258888s of 13.723365784s, submitted: 31
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:27.444036+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112353280 unmapped: 6176768 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2d7cda5/0x2e3c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:28.444287+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:29.444528+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:30.444832+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:31.445061+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:32.445277+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:33.445488+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:34.445696+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:35.445840+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:36.446144+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:37.446452+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:38.446667+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ec800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x55922723d4a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af7c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af7c00 session 0x55922723cb40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e6000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e6000 session 0x55922723dc20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:39.446896+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1000 session 0x5592254712c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114376704 unmapped: 4153344 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1800 session 0x559225471a40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:40.447079+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114376704 unmapped: 4153344 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ec800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x559225624f00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af7c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.490158081s of 14.116064072s, submitted: 3
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:41.447267+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af7c00 session 0x559223b87e00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115531776 unmapped: 13500416 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:42.447492+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115531776 unmapped: 13500416 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:43.447755+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b6000/0x0/0x4ffc00000, data 0x3506da5/0x35c6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411121 data_alloc: 251658240 data_used: 29302784
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:44.447934+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:45.448126+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:46.448352+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:47.448580+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:48.448812+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411257 data_alloc: 251658240 data_used: 29302784
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:49.448970+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e2000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e2000 session 0x55922546f0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:50.449136+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:51.449304+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:52.449497+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:53.449709+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411409 data_alloc: 251658240 data_used: 29306880
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:54.449901+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa400 session 0x5592247d8960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:55.450111+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:56.450323+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922721fa40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f3400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.360092163s of 15.706788063s, submitted: 14
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f3400 session 0x55922721fc20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afd400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:57.450460+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 13418496 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549ec00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:58.450609+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120020992 unmapped: 9011200 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1456479 data_alloc: 251658240 data_used: 33521664
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:59.450757+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 8978432 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b3000/0x0/0x4ffc00000, data 0x3507db5/0x35c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:00.450900+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 8945664 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:01.451038+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 8945664 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b3000/0x0/0x4ffc00000, data 0x3507db5/0x35c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:02.451192+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:03.451343+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1456615 data_alloc: 251658240 data_used: 33521664
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:04.451552+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:05.451714+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b1000/0x0/0x4ffc00000, data 0x350adb5/0x35cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:06.451940+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:07.452095+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:08.452251+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b1000/0x0/0x4ffc00000, data 0x350adb5/0x35cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 9330688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.960206985s of 12.248162270s, submitted: 5
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1457327 data_alloc: 251658240 data_used: 33529856
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:09.452384+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119889920 unmapped: 9142272 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:10.452545+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120283136 unmapped: 8749056 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:11.452894+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 7888896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:12.453040+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 7888896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f2000/0x0/0x4ffc00000, data 0x38bbdb5/0x397c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:13.453209+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1500921 data_alloc: 251658240 data_used: 33931264
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:14.453412+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:15.455497+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:16.455711+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:17.455901+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:18.456081+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549ec00 session 0x55922669fe00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afd400 session 0x55922721e960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1495881 data_alloc: 251658240 data_used: 33931264
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.955549240s of 10.352662086s, submitted: 63
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:19.456260+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922723c960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:20.456427+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:21.456589+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:22.456768+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:23.456942+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c2f000/0x0/0x4ffc00000, data 0x2d8dda5/0x2e4d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356198 data_alloc: 234881024 data_used: 25632768
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:24.457189+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:25.457535+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:26.457896+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:27.458093+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c2f000/0x0/0x4ffc00000, data 0x2d8dda5/0x2e4d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:28.458387+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356198 data_alloc: 234881024 data_used: 25632768
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af6400 session 0x5592267a63c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.853686333s of 10.125116348s, submitted: 14
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afcc00 session 0x5592272292c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:29.458971+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 20922368 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2800 session 0x55922723d4a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:30.459539+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:31.460010+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:32.460178+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08000 session 0x559227227a40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:33.460359+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:34.460581+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:35.460773+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:36.461009+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:37.461175+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:38.461342+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:39.461936+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:40.462161+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:41.462392+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:42.462553+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:43.462784+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:44.462978+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:45.463152+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:46.463355+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.126308441s of 17.207635880s, submitted: 34
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:47.463506+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:48.463759+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1054257 data_alloc: 234881024 data_used: 15777792
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:49.463919+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261eb400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:50.464087+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:51.464237+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:52.464448+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109297664 unmapped: 19734528 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:53.464611+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e9800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e9800 session 0x559226feed20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676f400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108732416 unmapped: 20299776 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1053666 data_alloc: 234881024 data_used: 14729216
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x559226784d20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:54.464782+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108339200 unmapped: 20692992 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afb000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:55.464966+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c1000/0x0/0x4ffc00000, data 0x10fbdbf/0x11bb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108339200 unmapped: 20692992 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb000 session 0x55922721eb40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b00000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00000 session 0x55922669f0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:56.465233+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afbc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:57.465429+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afbc00 session 0x5592254eeb40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:58.465621+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e9800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e9800 session 0x55922721fa40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1132656 data_alloc: 234881024 data_used: 14729216
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:59.465818+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676f400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x55922721fc20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afb000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.649352074s of 13.054588318s, submitted: 40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108716032 unmapped: 24518656 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9f02000/0x0/0x4ffc00000, data 0x1abadf8/0x1b7a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:00.465937+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb000 session 0x55922721e960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 24207360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b00000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226634800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:01.466140+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 24207360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:02.466313+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:03.466552+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194396 data_alloc: 234881024 data_used: 20054016
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:04.466819+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:05.467060+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:06.467350+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:07.467569+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:08.467787+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194396 data_alloc: 234881024 data_used: 20054016
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:09.468034+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:10.468331+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 22953984 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:11.468548+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 22953984 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:12.468867+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.911570549s of 12.916566849s, submitted: 2
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb400 session 0x559224554960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114982912 unmapped: 18251776 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:13.468993+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115589120 unmapped: 17645568 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292618 data_alloc: 234881024 data_used: 21278720
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:14.469119+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b6000/0x0/0x4ffc00000, data 0x24fddf8/0x25bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 17416192 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:15.469273+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113991680 unmapped: 19243008 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:16.469501+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113991680 unmapped: 19243008 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:17.469655+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:18.469824+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301298 data_alloc: 234881024 data_used: 21491712
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:19.469983+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:20.470141+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:21.470360+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:22.470562+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:23.470758+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301618 data_alloc: 234881024 data_used: 21499904
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:24.470950+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:25.471124+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:26.471372+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.075481415s of 14.293769836s, submitted: 87
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:27.471521+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:28.471704+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301498 data_alloc: 234881024 data_used: 21504000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:29.471883+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:30.472058+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:31.472285+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:32.472623+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:33.472813+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300148 data_alloc: 234881024 data_used: 21504000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:34.472935+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:35.473184+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:36.473458+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:37.473688+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:38.473934+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301820 data_alloc: 234881024 data_used: 21557248
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:39.474198+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:40.474377+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.004294395s of 14.015699387s, submitted: 3
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00000 session 0x55922721ef00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:41.474542+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634800 session 0x5592267a63c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 19996672 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:42.474703+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 19996672 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:43.474880+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa16e000/0x0/0x4ffc00000, data 0x112cdf8/0x11ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1076098 data_alloc: 234881024 data_used: 10645504
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:44.475062+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x55922723c780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:45.475236+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:46.475464+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:47.475664+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:48.475862+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:49.476063+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:50.476253+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:51.476446+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:52.476620+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:53.476840+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:54.477044+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:55.477187+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:56.477372+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:57.477569+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:58.477823+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:59.478019+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:00.478237+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:01.478442+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:02.478590+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:03.478794+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:04.478956+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:05.479109+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:06.479329+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:07.479505+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:08.479712+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:09.479943+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:10.480106+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa800 session 0x559226ffa960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676ec00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676ec00 session 0x559226ffa780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af6400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af6400 session 0x559226ffab40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226ffb680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226634800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.319431305s of 30.379514694s, submitted: 28
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:11.480248+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 26157056 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:12.480390+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634800 session 0x559226ffa5a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e000 session 0x55922721f4a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b06c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06c00 session 0x55922721e5a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e4400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e4400 session 0x55922721e960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226ffb860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:13.480548+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:14.480691+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117062 data_alloc: 234881024 data_used: 10539008
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3ce000/0x0/0x4ffc00000, data 0x15ede08/0x16ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:15.480935+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3ce000/0x0/0x4ffc00000, data 0x15ede08/0x16ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:16.481092+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b01c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b01c00 session 0x5592247d9860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:17.481258+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592267850e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:18.481437+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afc400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc400 session 0x559226784780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b06400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 26206208 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:19.481583+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117924 data_alloc: 234881024 data_used: 10539008
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06400 session 0x559226784960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:20.481760+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:21.481925+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:22.482079+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:23.482200+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:24.482329+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152408 data_alloc: 234881024 data_used: 15626240
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:25.482517+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:26.482820+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:27.482969+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:28.483135+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:29.483275+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152408 data_alloc: 234881024 data_used: 15626240
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:30.483449+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.764934540s of 19.582212448s, submitted: 45
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:31.483645+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:32.483821+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:33.484008+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:34.484175+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152956 data_alloc: 234881024 data_used: 15638528
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:35.484364+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_commit, latency = 5.380156040s
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_sync, latency = 5.380156517s
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.380500793s, txc = 0x559226356c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114221056 unmapped: 19013632 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:36.484567+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113123328 unmapped: 20111360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:37.484763+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f963c000/0x0/0x4ffc00000, data 0x237ee18/0x2440000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,11])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 19062784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:38.484924+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114221056 unmapped: 19013632 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:39.485087+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f963c000/0x0/0x4ffc00000, data 0x237ee18/0x2440000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253386 data_alloc: 234881024 data_used: 16334848
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:40.485221+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:41.485369+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f960a000/0x0/0x4ffc00000, data 0x23b0e18/0x2472000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:42.486002+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:43.486254+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f960a000/0x0/0x4ffc00000, data 0x23b0e18/0x2472000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:44.486472+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1263652 data_alloc: 234881024 data_used: 16482304
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:45.486647+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.865746021s of 15.085161209s, submitted: 113
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:46.486873+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:47.487020+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:48.487306+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9607000/0x0/0x4ffc00000, data 0x23b3e18/0x2475000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112648192 unmapped: 20586496 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:49.570995+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261276 data_alloc: 234881024 data_used: 16486400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:50.571150+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:51.571353+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:52.571638+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:53.571911+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:54.572101+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261500 data_alloc: 234881024 data_used: 16486400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:55.572334+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112664576 unmapped: 20570112 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afc000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc000 session 0x559224ef1a40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af9800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9800 session 0x559224ef03c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x55922721f680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922721f4a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af9c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:56.572604+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112664576 unmapped: 20570112 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.364326477s of 10.769536018s, submitted: 4
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9605000/0x0/0x4ffc00000, data 0x23b4e28/0x2477000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:57.572758+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112672768 unmapped: 20561920 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:58.572944+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112672768 unmapped: 20561920 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9c00 session 0x559227226960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x559226ffa1e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af9800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9800 session 0x55922723d680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afc000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:59.573079+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 23486464 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1329277 data_alloc: 234881024 data_used: 16490496
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc000 session 0x5592265523c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x55922669e1e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:00.573198+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922483b000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922483b000 session 0x559226785c20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:01.573343+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e38000/0x0/0x4ffc00000, data 0x2b81e28/0x2c44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afdc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afdc00 session 0x5592247d8d20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:02.573620+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592270fc5a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x559226fef680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e38000/0x0/0x4ffc00000, data 0x2b81e28/0x2c44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e3800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:03.573778+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b03800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112599040 unmapped: 23789568 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:04.573918+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113762304 unmapped: 22626304 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352731 data_alloc: 234881024 data_used: 19812352
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:05.574064+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:06.574266+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:07.574446+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:08.574622+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e37000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:09.574833+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119627776 unmapped: 16760832 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383891 data_alloc: 234881024 data_used: 24457216
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:10.575012+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119627776 unmapped: 16760832 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e37000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.307135582s of 14.417451859s, submitted: 28
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:11.575130+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:12.575364+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:13.575510+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:14.575673+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1384819 data_alloc: 234881024 data_used: 24469504
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:15.575807+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e36000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121552896 unmapped: 14835712 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:16.576026+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121905152 unmapped: 14483456 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:17.576190+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123551744 unmapped: 12836864 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:18.576361+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123551744 unmapped: 12836864 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8723000/0x0/0x4ffc00000, data 0x3295e38/0x3359000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:19.576601+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442267 data_alloc: 234881024 data_used: 24694784
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:20.576839+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:21.576980+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.862577438s of 11.139899254s, submitted: 73
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:22.577111+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123625472 unmapped: 12763136 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:23.577286+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123658240 unmapped: 12730368 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:24.577437+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123658240 unmapped: 12730368 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440939 data_alloc: 234881024 data_used: 24694784
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e3800 session 0x5592272270e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:25.577565+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b03800 session 0x559226ffab40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 12722176 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,4])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592255661e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:26.577864+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:27.578002+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:28.578113+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:29.578239+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273528 data_alloc: 234881024 data_used: 16486400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226f252c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592247d94a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b01400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:30.578383+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:31.578568+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b01400 session 0x5592263fe000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:32.578771+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:33.578949+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:34.579163+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097099 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:35.579309+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:36.579460+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:37.579623+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:38.579816+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.934545517s of 17.322147369s, submitted: 73
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:39.579982+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098611 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:40.580134+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:41.580629+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:42.580908+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:43.581625+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:44.581955+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098611 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2687778899' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:45.582094+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:46.582349+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:47.582864+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:48.583006+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:49.583323+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:50.583495+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:51.583752+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:52.583961+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:53.584132+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:54.584286+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:55.584421+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:56.584566+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:57.584903+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:58.585064+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:59.585249+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:00.585395+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:01.585520+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:02.585650+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eeb000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x559225566960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x55922657e960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x55922657f4a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eeb000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x55922546e960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:03.585789+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.590827942s of 24.071311951s, submitted: 2
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:04.585972+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099528 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 23085056 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:05.586138+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9e3a000/0x0/0x4ffc00000, data 0x1b82da6/0x1c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,6,11])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 22855680 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9e3a000/0x0/0x4ffc00000, data 0x1b82da6/0x1c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,17])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:06.586318+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 22855680 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:07.586481+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 29122560 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x55922546ef00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1800 session 0x5592267a72c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x5592267a6000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592263ffe00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eeb000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x5592263feb40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:08.586697+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:09.586870+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1187056 data_alloc: 234881024 data_used: 10539008
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:10.587007+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592263fef00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:11.587145+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632000 session 0x559224742000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f99a9000/0x0/0x4ffc00000, data 0x1c03da6/0x1cc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226f25680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:12.587282+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 33513472 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226f25e00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:13.587429+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afd000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e8400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 33513472 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:14.587560+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1233177 data_alloc: 234881024 data_used: 16625664
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 33120256 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:15.587749+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:16.587901+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:17.588031+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:18.588160+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:19.588282+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268441 data_alloc: 234881024 data_used: 21921792
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:20.588426+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:21.588547+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:22.588688+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1c00 session 0x55922546e5a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:23.588875+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:24.589051+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268897 data_alloc: 234881024 data_used: 21934080
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.883409500s of 21.952882767s, submitted: 22
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:25.589195+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126459904 unmapped: 21479424 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:26.589371+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c8d000/0x0/0x4ffc00000, data 0x2916dc9/0x29d7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,5])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124387328 unmapped: 23552000 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:27.589580+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c6a000/0x0/0x4ffc00000, data 0x2939dc9/0x29fa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124616704 unmapped: 23322624 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:28.589801+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:29.589961+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372079 data_alloc: 234881024 data_used: 22802432
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:30.590091+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:31.590292+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 7795 writes, 32K keys, 7795 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 7795 writes, 1759 syncs, 4.43 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1988 writes, 7632 keys, 1988 commit groups, 1.0 writes per commit group, ingest: 8.26 MB, 0.01 MB/s
                                           Interval WAL: 1988 writes, 772 syncs, 2.58 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:32.590469+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:33.590608+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:34.590818+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372095 data_alloc: 234881024 data_used: 22802432
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:35.590957+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:36.591171+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:37.591325+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:38.591482+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:39.591634+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372095 data_alloc: 234881024 data_used: 22802432
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:40.591831+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b05400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:41.591963+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:42.592113+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:43.592309+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:44.592510+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372247 data_alloc: 234881024 data_used: 22806528
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:45.592657+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:46.592841+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.538951874s of 21.155471802s, submitted: 103
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:47.593101+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:48.593284+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:49.593477+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370448 data_alloc: 234881024 data_used: 22806528
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:50.593596+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:51.593786+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:52.593958+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:53.594119+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,4,0,6])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 15261696 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1400 session 0x559225017c20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:54.594261+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1500262 data_alloc: 234881024 data_used: 22806528
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:55.594397+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:56.594586+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79ef000/0x0/0x4ffc00000, data 0x3bbcdc9/0x3c7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08c00 session 0x559225473c20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:57.594795+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79ef000/0x0/0x4ffc00000, data 0x3bbcdc9/0x3c7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02800 session 0x55922721ef00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:58.594970+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.648444176s of 12.136721611s, submitted: 17
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b03000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b03000 session 0x559226552960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x55922546fc20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122544128 unmapped: 33792000 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:59.595115+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b09000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248ef000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1502946 data_alloc: 234881024 data_used: 22810624
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122544128 unmapped: 33792000 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:00.595263+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135053312 unmapped: 21282816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549f800 session 0x559225470b40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b06000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:01.595383+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:02.595584+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:03.595761+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:04.595929+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1625513 data_alloc: 251658240 data_used: 41050112
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139075584 unmapped: 17260544 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:05.596156+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139149312 unmapped: 17186816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:06.596445+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139288576 unmapped: 17047552 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:07.596709+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:08.596947+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:09.597132+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1625426 data_alloc: 251658240 data_used: 41050112
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:10.597299+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139460608 unmapped: 16875520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.569581985s of 12.389707565s, submitted: 232
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:11.597497+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142147584 unmapped: 14188544 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:12.597655+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7566000/0x0/0x4ffc00000, data 0x4044dc9/0x4105000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:13.597798+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:14.597927+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:15.598096+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b05400 session 0x5592250174a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:16.598337+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:17.598523+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:18.598710+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142516224 unmapped: 13819904 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:19.604273+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142516224 unmapped: 13819904 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:20.604471+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:21.604609+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:22.604766+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:23.605050+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:24.605230+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:25.605425+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b09000 session 0x55922669e780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248ef000 session 0x559225566960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:26.605599+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.538358688s of 15.629971504s, submitted: 40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x5592255ff0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:27.605844+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:28.606031+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:29.606228+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1381072 data_alloc: 234881024 data_used: 22806528
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:30.606398+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:31.606559+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:32.606821+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:33.607076+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:34.607445+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382584 data_alloc: 234881024 data_used: 22806528
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:35.607654+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:36.607920+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:37.608174+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:38.608470+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:39.608602+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382584 data_alloc: 234881024 data_used: 22806528
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:40.608860+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afd000 session 0x559226f24960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e8400 session 0x55922669e3c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226636c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.595676422s of 14.635678291s, submitted: 13
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:41.609055+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226636c00 session 0x559226553e00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:42.609220+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:43.609430+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:44.609627+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:45.609774+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:46.609998+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:47.610142+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:48.610615+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:49.610940+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:50.611062+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:51.611354+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:52.611698+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:53.612034+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:54.612185+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:55.612346+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:56.612654+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:57.612975+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:58.613123+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02400 session 0x55922723d860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:59.613368+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:00.613628+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:02.109471+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:03.109819+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:04.110070+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:05.110265+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:06.110462+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:07.110850+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:08.111101+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:09.111375+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:10.111622+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:11.111910+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:12.112149+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ecc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x5592267a63c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f3400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f3400 session 0x55922721e5a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ec800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x55922721fa40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226728800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226728800 session 0x559225016780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261eb000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.576278687s of 31.081003189s, submitted: 24
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb000 session 0x559226ffb860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592263fe1e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559226f25680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261edc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261edc00 session 0x559226f250e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4400 session 0x559226f25860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:13.112349+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 37027840 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:14.112670+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 37027840 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:15.112843+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559225471860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194678 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119324672 unmapped: 37011456 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261eb000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb000 session 0x559225472b40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:16.113047+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261edc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261edc00 session 0x559226f252c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa800 session 0x559226f241e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 36855808 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c40000/0x0/0x4ffc00000, data 0x196cdf8/0x1a2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:17.113271+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 36839424 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:18.113395+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 36839424 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:19.113569+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c1b000/0x0/0x4ffc00000, data 0x1990e08/0x1a51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:20.113699+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260962 data_alloc: 234881024 data_used: 19304448
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:21.114824+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:22.115267+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 36528128 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.900426865s of 10.097883224s, submitted: 55
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1000 session 0x559227226b40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x559224742b40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:23.115824+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x55922669e780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c1b000/0x0/0x4ffc00000, data 0x1990e08/0x1a51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:24.116330+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:25.116811+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:26.117053+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:27.117505+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:28.117751+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:29.118212+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:30.118621+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:31.118955+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:32.119200+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:33.119405+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:34.119772+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:35.120131+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:36.120273+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x559226785a40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592250174a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eea400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eea400 session 0x559225017680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559225016000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.801321030s of 13.943515778s, submitted: 49
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:37.120431+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,15])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559225017c20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x5592254705a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x559225473e00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1c00 session 0x55922721eb40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x55922721f0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:38.120551+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:39.120823+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:40.121029+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f937a000/0x0/0x4ffc00000, data 0x2232da6/0x22f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264351 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:41.121324+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e4800 session 0x559224ef01e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f937a000/0x0/0x4ffc00000, data 0x2232da6/0x22f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:42.121479+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x5592255661e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:43.121639+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549f000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:44.121904+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:45.122056+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303380 data_alloc: 234881024 data_used: 15511552
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 43442176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:46.122276+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125919232 unmapped: 38813696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:47.122543+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125952000 unmapped: 38780928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:48.122779+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.294668198s of 11.932563782s, submitted: 37
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549f000 session 0x55922657fe00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e400 session 0x5592272270e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125952000 unmapped: 38780928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:49.123036+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118267904 unmapped: 46465024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:50.123280+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226552b40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 46399488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: mgrc ms_handle_reset ms_handle_reset con 0x5592249d8000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/4198923246
Jan 23 10:30:16 compute-2 ceph-osd[81231]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/4198923246,v1:192.168.122.100:6801/4198923246]
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: get_auth_request con 0x55922676e400 auth_method 0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: mgrc handle_mgr_configure stats_period=5
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:51.123497+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:52.123680+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:53.123870+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:54.124887+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x559225471a40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:55.125554+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:56.126814+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:57.127419+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:58.128372+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:59.128971+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:00.129267+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:01.129707+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:02.130136+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:03.130478+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:04.130619+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:05.130923+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:06.131269+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:07.131577+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:08.131930+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:09.132190+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:10.132392+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:11.132581+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261f0000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.419353485s of 23.247339249s, submitted: 36
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:12.132719+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:13.132912+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:14.133099+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:15.133231+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:16.133478+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:17.133867+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:18.134252+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:19.134486+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:20.134771+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:21.134887+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:22.135090+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:23.135298+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:24.135561+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:25.135763+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:26.135910+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:27.136089+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226f23000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f23000 session 0x559226784f00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559227766000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559227766000 session 0x559226785c20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b00800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00800 session 0x5592254725a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592254730e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.210437775s of 15.619210243s, submitted: 1
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592263ff2c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:28.136264+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:29.136405+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:30.136582+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220376 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afb400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb400 session 0x5592270fd860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:31.136716+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:32.136953+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x5592270fc960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d7400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d7400 session 0x559225567c20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:33.137153+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118620160 unmapped: 46112768 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:34.137397+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118620160 unmapped: 46112768 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226ffbc20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:35.137631+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221149 data_alloc: 234881024 data_used: 10539008
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:36.137802+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:37.137987+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:38.138161+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:39.138323+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:40.138478+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279061 data_alloc: 234881024 data_used: 19120128
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:41.138676+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:42.138818+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:43.138959+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:44.139159+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:45.139335+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279061 data_alloc: 234881024 data_used: 19120128
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:46.139474+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:47.139695+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:48.139835+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.080177307s of 20.722246170s, submitted: 23
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 43720704 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9265000/0x0/0x4ffc00000, data 0x2340d96/0x23ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:49.140030+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126017536 unmapped: 38715392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:50.140186+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126550016 unmapped: 38182912 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1390845 data_alloc: 234881024 data_used: 19333120
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:51.140268+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126550016 unmapped: 38182912 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec0000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:52.140377+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126566400 unmapped: 38166528 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:53.140574+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 39575552 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec9000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:54.140712+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:55.140931+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec9000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382581 data_alloc: 234881024 data_used: 19349504
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:56.141098+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:57.141272+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:58.142536+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:59.143626+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:00.144008+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1384645 data_alloc: 234881024 data_used: 19349504
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:01.144193+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ea1000/0x0/0x4ffc00000, data 0x270cd96/0x27cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:02.144968+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592270fc000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.293542862s of 14.293901443s, submitted: 116
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226f250e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x55922677f2c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:03.145467+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:04.146104+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:05.146389+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:06.146550+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:07.146818+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:08.147174+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:09.147492+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:10.147965+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:11.148196+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:12.148489+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:13.148668+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:14.148994+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:15.149136+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:16.149360+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:17.149709+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:18.150006+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:19.150278+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:20.150521+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:21.150765+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:22.151040+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:23.151268+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:24.151466+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d5800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d5800 session 0x5592272292c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226fee3c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592254efe00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226785c20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.185562134s of 22.248020172s, submitted: 32
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x5592254725a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e6000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e6000 session 0x55922723c960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x55922657e1e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x5592270fde00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x559225473680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:25.151719+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:26.151970+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209420 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:27.152239+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:28.152368+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226e3c800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3c800 session 0x559225017a40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa07c000/0x0/0x4ffc00000, data 0x1530da6/0x15f0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225656800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225656800 session 0x55922669f2c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:29.152528+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x559226f24780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226ffa3c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:30.152662+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226e3c800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 45318144 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:31.152819+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216924 data_alloc: 234881024 data_used: 10543104
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 45318144 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:32.153095+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:33.153270+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:34.153439+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa056000/0x0/0x4ffc00000, data 0x1554dd9/0x1616000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:35.153582+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:36.153745+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1246108 data_alloc: 234881024 data_used: 14667776
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:37.153931+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:38.154155+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa056000/0x0/0x4ffc00000, data 0x1554dd9/0x1616000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:39.154341+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:40.154558+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:41.154708+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1246108 data_alloc: 234881024 data_used: 14667776
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:42.155041+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.542800903s of 17.671800613s, submitted: 41
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122945536 unmapped: 41787392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:43.155296+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124780544 unmapped: 39952384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9869000/0x0/0x4ffc00000, data 0x1d41dd9/0x1e03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:44.155453+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125403136 unmapped: 39329792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:45.155633+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f97dc000/0x0/0x4ffc00000, data 0x1dc8dd9/0x1e8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226864c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226864c00 session 0x55922677f860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226863400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226863400 session 0x55922721f0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922723b000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922723b000 session 0x559226aeb680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922723b000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922723b000 session 0x559225016000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124108800 unmapped: 40624128 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x55922669e1e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x55922657f4a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226863400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226863400 session 0x559224649680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226864c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226864c00 session 0x559224ef0f00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225656c00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225656c00 session 0x55922723cf00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:46.155808+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386172 data_alloc: 234881024 data_used: 16515072
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 39067648 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:47.156059+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 39067648 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:48.156222+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9129000/0x0/0x4ffc00000, data 0x2477e4b/0x253b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:49.156369+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:50.156514+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226634400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634400 session 0x55922723da40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:51.156643+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1379724 data_alloc: 234881024 data_used: 16515072
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549c400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549c400 session 0x5592267852c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x559225625680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:52.156795+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ecc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x5592270fd860
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:53.156926+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ecc00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.054047585s of 10.986348152s, submitted: 167
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 37986304 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:54.157067+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 37978112 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:55.157219+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128802816 unmapped: 35930112 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:56.157375+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419509 data_alloc: 234881024 data_used: 22065152
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:57.158185+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:58.158337+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:59.158536+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:00.158670+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:01.158816+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419509 data_alloc: 234881024 data_used: 22065152
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:02.158950+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:03.161591+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.671233177s of 10.673833847s, submitted: 1
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:04.162596+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128876544 unmapped: 35856384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910c000/0x0/0x4ffc00000, data 0x249be6e/0x2560000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:05.162782+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 133996544 unmapped: 30736384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:06.163333+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1494901 data_alloc: 234881024 data_used: 23621632
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 133668864 unmapped: 31064064 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:07.163990+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:08.164089+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:09.164584+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:10.164786+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:11.165017+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1505309 data_alloc: 234881024 data_used: 24363008
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:12.165291+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:13.165531+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:14.166028+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:15.166333+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:16.166579+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.098537445s of 12.288821220s, submitted: 94
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1502669 data_alloc: 234881024 data_used: 24371200
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135053312 unmapped: 29679616 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:17.166905+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:18.167068+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d2000/0x0/0x4ffc00000, data 0x2cd4e6e/0x2d99000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d2000/0x0/0x4ffc00000, data 0x2cd4e6e/0x2d99000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:19.167363+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:20.167631+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:21.167870+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1503037 data_alloc: 234881024 data_used: 24436736
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:22.168022+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88cd000/0x0/0x4ffc00000, data 0x2cdae6e/0x2d9f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:23.168245+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:24.168399+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x55922669f2c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226f23400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 29663232 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:25.168548+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135094272 unmapped: 29638656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:26.168780+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348975 data_alloc: 234881024 data_used: 16515072
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.806308746s of 10.239793777s, submitted: 37
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88ce000/0x0/0x4ffc00000, data 0x2cdae5e/0x2d9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,2])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:27.168946+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dfc/0x2009000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:28.169135+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dfc/0x2009000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:29.169339+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f23400 session 0x559226784f00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:30.169550+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:31.169891+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347011 data_alloc: 234881024 data_used: 16498688
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:32.170069+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:33.170185+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:34.170376+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:35.170566+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:36.170775+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347011 data_alloc: 234881024 data_used: 16498688
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:37.171047+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:38.171300+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:39.171772+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:40.172212+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.272031784s of 13.655331612s, submitted: 34
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x55922546f0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:41.172367+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1346027 data_alloc: 234881024 data_used: 16498688
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3c800 session 0x5592254ef0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:42.172686+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226865000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:43.172875+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127320064 unmapped: 37412864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:44.173013+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:45.173377+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:46.173608+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1199806 data_alloc: 234881024 data_used: 10649600
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa48d000/0x0/0x4ffc00000, data 0x111fdb9/0x11df000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:47.173914+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226865000 session 0x559223b86960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:48.174123+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:49.174383+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:50.174518+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:51.174920+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:52.175262+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:53.175428+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:54.175700+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:55.176033+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:56.176192+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:57.176552+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:58.176856+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:59.177052+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:00.177252+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:01.177491+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:02.177659+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:03.177809+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:04.177959+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:05.178129+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:06.178295+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:07.178519+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:08.178681+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:09.179091+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:10.179472+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:11.179820+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:12.180247+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:13.180636+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:14.181002+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:15.181311+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:16.181783+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:17.182227+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:18.182435+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:19.182785+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:20.183086+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:21.183381+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:22.183611+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ef000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ef000 session 0x559226aea3c0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225657800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657800 session 0x5592263feb40
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225657400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657400 session 0x559226fef0e0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226e3ac00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3ac00 session 0x55922669f680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676f400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 36.472564697s of 42.319786072s, submitted: 56
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:23.183905+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:24.184153+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:25.184272+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127377408 unmapped: 37355520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:26.184481+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x5592247d85a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225657400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657400 session 0x5592263fe780
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225657800
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657800 session 0x55922721fe00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ef000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156edcf/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1234189 data_alloc: 234881024 data_used: 10539008
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:27.184803+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:28.185019+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ef000 session 0x559224648000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226e3ac00
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3ac00 session 0x559226fee960
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:29.185195+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:30.185385+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b0a000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:31.185526+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1234133 data_alloc: 234881024 data_used: 10539008
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:32.185784+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:33.185991+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:34.186209+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 2.432877302s of 11.771712303s, submitted: 33
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b0a000 session 0x559225473c20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:35.186421+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261f1400
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226f22000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:36.186598+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1235130 data_alloc: 234881024 data_used: 10539008
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:37.186933+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:38.187185+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:39.187411+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:40.187661+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:41.187813+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265986 data_alloc: 234881024 data_used: 15028224
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:42.187952+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:43.188140+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:44.188348+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:45.188601+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:46.188775+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265986 data_alloc: 234881024 data_used: 15028224
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:47.188965+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:48.189114+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:49.189257+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.148657799s of 15.006252289s, submitted: 6
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:50.189376+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132644864 unmapped: 32088064 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:51.189489+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302734 data_alloc: 234881024 data_used: 15024128
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:52.189639+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b43000/0x0/0x4ffc00000, data 0x1a68e08/0x1b29000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,2,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:53.189838+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130285568 unmapped: 34447360 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:54.190054+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129679360 unmapped: 35053568 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f997a000/0x0/0x4ffc00000, data 0x1c29e08/0x1cea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,4])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:55.190268+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129687552 unmapped: 35045376 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:56.190389+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328778 data_alloc: 234881024 data_used: 15020032
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:57.190598+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96ff000/0x0/0x4ffc00000, data 0x1eace08/0x1f6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130875392 unmapped: 33857536 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:58.190893+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:59.191074+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1.013013244s of 10.206089973s, submitted: 61
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96ff000/0x0/0x4ffc00000, data 0x1eace08/0x1f6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:00.191267+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:01.191438+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1333106 data_alloc: 234881024 data_used: 15360000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:02.191645+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:03.191826+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:04.191977+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:05.192135+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96e2000/0x0/0x4ffc00000, data 0x1ec9e08/0x1f8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:06.192282+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348124 data_alloc: 234881024 data_used: 15777792
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:07.192507+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:08.192670+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:09.192863+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:10.193002+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.314796448s of 11.320782661s, submitted: 31
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261f1400 session 0x55922669fc20
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f22000 session 0x559226fef4a0
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa000
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:11.193125+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129081344 unmapped: 35651584 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96e2000/0x0/0x4ffc00000, data 0x1ec9e08/0x1f8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207674 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:12.193544+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129089536 unmapped: 35643392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa000 session 0x559226aeb680
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:13.194324+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:14.194848+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:15.195121+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:16.195434+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:17.195659+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:18.196075+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:19.196424+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:20.197008+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:21.197315+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:22.197634+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:23.197937+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:24.198105+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:25.198240+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:26.198433+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:27.198656+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:28.198902+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:29.199052+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:30.199199+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:31.199362+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:32.199582+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:33.199816+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:34.200022+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:35.200192+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:36.200361+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:37.200650+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:38.200904+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:39.201087+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:40.201312+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:41.201544+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:42.201715+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:43.202100+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:44.202294+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:45.202483+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:46.202681+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:47.202928+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:48.203124+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:49.203322+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:50.203538+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:51.203708+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:52.203888+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:53.204029+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:54.204200+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:55.204411+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:56.204560+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:57.204743+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:58.204910+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:59.205046+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:00.205210+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:01.205379+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:02.205561+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:03.205821+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:04.205950+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:05.206109+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:06.206297+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:07.206536+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:08.206747+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:09.206921+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:10.207075+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:11.207194+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:12.207339+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:13.207517+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:14.207808+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:15.208018+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:16.208227+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:17.208477+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:18.208667+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:19.208835+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:20.209137+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:21.209317+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:22.209500+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:23.209650+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:24.210296+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:25.210594+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:26.210895+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:27.211769+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129179648 unmapped: 35553280 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:28.211976+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129179648 unmapped: 35553280 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:29.212174+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:30.212416+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:31.212627+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:32.212841+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:33.213062+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:34.213278+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:35.213561+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:36.213916+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:37.214194+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:38.214433+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:39.214653+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:40.214831+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:41.215011+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:42.215136+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:16 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:16 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129228800 unmapped: 35504128 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:43.280094+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}'
Jan 23 10:30:16 compute-2 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 10:30:16 compute-2 ceph-osd[81231]: do_command 'config show' '{prefix=config show}'
Jan 23 10:30:16 compute-2 ceph-osd[81231]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 10:30:16 compute-2 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 10:30:16 compute-2 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 10:30:16 compute-2 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 10:30:16 compute-2 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:44.280489+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128770048 unmapped: 35962880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:30:16 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:45.280717+0000)
Jan 23 10:30:16 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128917504 unmapped: 35815424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:30:16 compute-2 ceph-osd[81231]: do_command 'log dump' '{prefix=log dump}'
Jan 23 10:30:16 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:30:16 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4224081289' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:16 compute-2 nova_compute[225701]: 2026-01-23 10:30:16.654 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.775s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:30:16 compute-2 nova_compute[225701]: 2026-01-23 10:30:16.660 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:30:16 compute-2 nova_compute[225701]: 2026-01-23 10:30:16.686 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:30:16 compute-2 nova_compute[225701]: 2026-01-23 10:30:16.688 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:30:16 compute-2 nova_compute[225701]: 2026-01-23 10:30:16.688 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.961s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:30:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.16518 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.25987 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.26048 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.16539 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.25993 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3107004277' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.26002 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2029648398' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.26066 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.16548 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2687778899' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4004933432' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/129791328' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1506185174' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4224081289' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3938838710' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 23 10:30:17 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/737140567' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:30:17 compute-2 nova_compute[225701]: 2026-01-23 10:30:17.111 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:17.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:17.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 10:30:17 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/232885380' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:17 compute-2 nova_compute[225701]: 2026-01-23 10:30:17.689 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:17 compute-2 nova_compute[225701]: 2026-01-23 10:30:17.690 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:17 compute-2 nova_compute[225701]: 2026-01-23 10:30:17.690 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:17 compute-2 crontab[240608]: (root) LIST (root)
Jan 23 10:30:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.26084 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.16569 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.26029 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.26032 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/737140567' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: pgmap v1115: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3169827845' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/873021322' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/232885380' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/753443402' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/790870186' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:30:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 10:30:18 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/641666540' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:18 compute-2 nova_compute[225701]: 2026-01-23 10:30:18.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 10:30:18 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2869222915' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:19 compute-2 nova_compute[225701]: 2026-01-23 10:30:19.344 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:19.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.26111 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.26056 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.16587 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.26126 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.26071 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.16593 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.26141 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.26089 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2692355636' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/641666540' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2848004950' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2869222915' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3337747780' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3799754458' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:19.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 10:30:19 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3233910781' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 23 10:30:19 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3682934773' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 10:30:20 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/341034865' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.16617 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.26156 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.26098 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.16635 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.26113 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1425965098' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: pgmap v1116: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.16653 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/457950240' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/641995280' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3233910781' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2946892865' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1463104035' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3682934773' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:30:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 10:30:20 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1621252379' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:30:20 compute-2 nova_compute[225701]: 2026-01-23 10:30:20.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:20 compute-2 nova_compute[225701]: 2026-01-23 10:30:20.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:30:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:21 compute-2 systemd[1]: Starting Hostname Service...
Jan 23 10:30:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:21 compute-2 systemd[1]: Started Hostname Service.
Jan 23 10:30:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:21.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:21.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:22 compute-2 nova_compute[225701]: 2026-01-23 10:30:22.114 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:23.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:23.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 23 10:30:23 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2294747835' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.16668 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.26137 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2112946453' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1755085481' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/341034865' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/331196613' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1013832290' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.26158 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/256833078' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4062298466' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1621252379' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/758305960' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.26173 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1191692492' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/15662595' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-mon[75771]: pgmap v1117: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1439010025' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:24 compute-2 nova_compute[225701]: 2026-01-23 10:30:24.346 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 23 10:30:24 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2182990666' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:30:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:25.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:25.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.26231 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1786861485' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2098994895' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2493257669' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.26243 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1879524475' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2855196845' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1569001041' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/526295899' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.16767 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1090320711' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3694179319' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: pgmap v1118: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.16779 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3613864487' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.16785 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2294747835' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1692553818' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.16797 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2531849906' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4080595786' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 23 10:30:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2610036103' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:30:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 23 10:30:27 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3569729491' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:27 compute-2 nova_compute[225701]: 2026-01-23 10:30:27.200 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 23 10:30:27 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/896818155' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3854792418' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3166484618' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.16809 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1648245877' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.16821 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2182990666' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: pgmap v1119: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.16836 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/340871671' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1855905813' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2836282561' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.16866 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.16887 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4080595786' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2610036103' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.26321 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3569729491' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-2 ceph-mon[75771]: pgmap v1120: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:27.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:27 compute-2 sudo[241339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:30:27 compute-2 sudo[241339]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:27 compute-2 sudo[241339]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 23 10:30:27 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2736972453' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:27.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 23 10:30:27 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2037715614' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:30:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 23 10:30:28 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/26149318' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 23 10:30:28 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1004606240' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.26357 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/896818155' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2246982503' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/931928461' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2736972453' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.26387 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.16956 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2037715614' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.26393 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/26149318' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.26399 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1004606240' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/292578968' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3039266935' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 23 10:30:28 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4061140840' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 23 10:30:28 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/631308360' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:30:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 23 10:30:28 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1961868252' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 23 10:30:29 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3026965908' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:29 compute-2 nova_compute[225701]: 2026-01-23 10:30:29.347 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 23 10:30:29 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1997317476' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:30:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:29.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.26414 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4061140840' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/631308360' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/584663929' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3849795310' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.26426 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1961868252' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3026965908' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-mon[75771]: pgmap v1121: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2920006099' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3687676278' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:30:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:29.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 23 10:30:29 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/629032271' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:30:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:31.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:31.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:32 compute-2 nova_compute[225701]: 2026-01-23 10:30:32.202 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:32 compute-2 ceph-mon[75771]: from='client.26438 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:32 compute-2 ceph-mon[75771]: from='client.26335 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1997317476' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:30:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1049295026' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 10:30:32 compute-2 ceph-mon[75771]: from='client.26347 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3723003283' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 10:30:32 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/629032271' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:30:32 compute-2 ceph-mon[75771]: from='client.26359 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:32 compute-2 ceph-mon[75771]: from='client.26450 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:33 compute-2 ceph-mon[75771]: from='client.16995 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:33 compute-2 ceph-mon[75771]: from='client.26365 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:33 compute-2 ceph-mon[75771]: from='client.26371 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:33 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1786797257' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 10:30:33 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3321049884' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 10:30:33 compute-2 ceph-mon[75771]: from='client.26380 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:33 compute-2 ceph-mon[75771]: pgmap v1122: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:33 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3112698172' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 10:30:33 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1709192928' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 10:30:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 23 10:30:33 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3880234350' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:30:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:33.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:33.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 23 10:30:33 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3916140662' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:30:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:34 compute-2 ceph-mon[75771]: from='client.26389 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:34 compute-2 ceph-mon[75771]: from='client.26395 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:34 compute-2 ceph-mon[75771]: from='client.26477 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:34 compute-2 ceph-mon[75771]: pgmap v1123: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:34 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3510438922' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 10:30:34 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3880234350' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:30:34 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/368403482' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 10:30:34 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3916140662' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:30:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:34 compute-2 nova_compute[225701]: 2026-01-23 10:30:34.348 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:34 compute-2 podman[241963]: 2026-01-23 10:30:34.658548426 +0000 UTC m=+0.060040057 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 10:30:34 compute-2 podman[241962]: 2026-01-23 10:30:34.698450683 +0000 UTC m=+0.099536205 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 10:30:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 23 10:30:34 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3359416428' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:30:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.26404 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.17031 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.26416 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.26489 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.17040 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2654627831' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/965967208' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2586594643' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1237615592' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3359416428' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2023447290' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:35.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:35.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:35 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:35 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:35 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='client.26498 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: pgmap v1124: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='client.17085 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='client.26525 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='client.17094 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='client.26534 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/112016515' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/514541159' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:36 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 23 10:30:36 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/25636672' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:30:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:37 compute-2 nova_compute[225701]: 2026-01-23 10:30:37.204 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:37.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:37 compute-2 ovs-appctl[242834]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 10:30:37 compute-2 ovs-appctl[242839]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 10:30:37 compute-2 ovs-appctl[242843]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 10:30:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:37.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:37 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 23 10:30:37 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1362458980' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:30:37 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2493126188' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 10:30:37 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3084282709' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 10:30:37 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/25636672' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:30:37 compute-2 ceph-mon[75771]: from='client.26567 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:37 compute-2 ceph-mon[75771]: from='client.17130 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:37 compute-2 ceph-mon[75771]: from='client.26573 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:37 compute-2 ceph-mon[75771]: pgmap v1125: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 23 10:30:38 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/419775306' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Jan 23 10:30:38 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/372859611' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.26579 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.17139 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1362458980' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1176070681' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2010755244' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/419775306' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/571241287' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2274561256' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2918126809' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/372859611' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3820475493' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Jan 23 10:30:39 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1920671338' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 10:30:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:39 compute-2 nova_compute[225701]: 2026-01-23 10:30:39.349 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:39.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:39.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Jan 23 10:30:40 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/50610532' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 10:30:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:30:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 6595 writes, 34K keys, 6595 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s
                                           Cumulative WAL: 6594 writes, 6594 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1522 writes, 7535 keys, 1522 commit groups, 1.0 writes per commit group, ingest: 17.69 MB, 0.03 MB/s
                                           Interval WAL: 1521 writes, 1521 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     22.9      2.13              0.20        18    0.118       0      0       0.0       0.0
                                             L6      1/0   13.93 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.4     84.8     72.8      2.92              0.88        17    0.172     94K   9319       0.0       0.0
                                            Sum      1/0   13.93 MB   0.0      0.2     0.0      0.2       0.3      0.1       0.0   5.4     49.0     51.8      5.06              1.08        35    0.144     94K   9319       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.6     24.9     25.6      2.49              0.20         8    0.311     26K   2540       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0     84.8     72.8      2.92              0.88        17    0.172     94K   9319       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     23.0      2.13              0.20        17    0.125       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.048, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.26 GB write, 0.11 MB/s write, 0.24 GB read, 0.10 MB/s read, 5.1 seconds
                                           Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.10 MB/s read, 2.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 22.75 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000186 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1398,22.02 MB,7.24371%) FilterBlock(35,274.42 KB,0.0881546%) IndexBlock(35,473.86 KB,0.152221%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:30:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:40 compute-2 ceph-mon[75771]: from='client.26606 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1920671338' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 10:30:40 compute-2 ceph-mon[75771]: from='client.17172 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:40 compute-2 ceph-mon[75771]: pgmap v1126: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1796333743' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:40 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/49428748' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Jan 23 10:30:40 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1965759482' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 10:30:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:41.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Jan 23 10:30:41 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/22422002' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.26590 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4128500998' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/50610532' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1655420808' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2904729046' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2704307379' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1965759482' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2550268996' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2288402487' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: pgmap v1127: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.26623 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:41 compute-2 ceph-mon[75771]: from='client.17208 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:41.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:42 compute-2 nova_compute[225701]: 2026-01-23 10:30:42.206 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:42 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Jan 23 10:30:42 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1755609551' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 10:30:42 compute-2 ceph-mon[75771]: from='client.26642 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/22422002' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 10:30:42 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2641249001' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/560283955' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-2 ceph-mon[75771]: from='client.26641 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:42 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/114055830' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3879520383' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Jan 23 10:30:43 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3090469002' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 10:30:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:43.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:43.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:44 compute-2 ceph-mon[75771]: from='client.26650 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:44 compute-2 ceph-mon[75771]: from='client.17229 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:44 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1755609551' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 10:30:44 compute-2 ceph-mon[75771]: from='client.26663 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:44 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/946102600' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:44 compute-2 ceph-mon[75771]: pgmap v1128: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:44 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3090469002' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 10:30:44 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3587918853' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:44 compute-2 nova_compute[225701]: 2026-01-23 10:30:44.392 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Jan 23 10:30:44 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4118000991' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 10:30:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Jan 23 10:30:44 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3681758551' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 10:30:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:45 compute-2 sudo[244687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:30:45 compute-2 sudo[244687]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:45 compute-2 sudo[244687]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.26674 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.17244 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.26675 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.26683 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.17250 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.26681 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3100141440' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4118000991' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1756811035' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2020289235' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3681758551' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 10:30:45 compute-2 sudo[244723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:30:45 compute-2 sudo[244723]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:45.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:45 compute-2 sudo[244723]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:45.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 10:30:46 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1942062714' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/634711209' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: pgmap v1129: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Jan 23 10:30:47 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3530506813' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 10:30:47 compute-2 nova_compute[225701]: 2026-01-23 10:30:47.208 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:47.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:47 compute-2 sudo[244931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:30:47 compute-2 sudo[244931]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:47 compute-2 sudo[244931]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:47.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.17274 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.26710 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.26705 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.17280 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.26714 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.26716 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2973156896' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1942062714' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1869920413' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.17301 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3530506813' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: pgmap v1130: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 769 B/s rd, 0 op/s
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1725430918' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3995885793' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1993038481' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1944953467' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 10:30:48 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2304269384' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Jan 23 10:30:48 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3049937054' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 10:30:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:49 compute-2 nova_compute[225701]: 2026-01-23 10:30:49.392 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:49.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:49.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:49 compute-2 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 10:30:49 compute-2 ceph-mon[75771]: from='client.17307 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-2 ceph-mon[75771]: from='client.26740 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-2 ceph-mon[75771]: from='client.26741 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/955048904' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2304269384' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2448494540' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:30:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2448494540' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:30:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3049937054' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3762032723' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:50 compute-2 systemd[1]: Starting Time & Date Service...
Jan 23 10:30:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:50 compute-2 systemd[1]: Started Time & Date Service.
Jan 23 10:30:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Jan 23 10:30:50 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3027541159' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:50 compute-2 ceph-mon[75771]: from='client.26747 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:50 compute-2 ceph-mon[75771]: pgmap v1131: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 513 B/s rd, 0 op/s
Jan 23 10:30:50 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/966591196' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:50 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/736282130' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:30:50 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3027541159' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:51 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Jan 23 10:30:51 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2631473347' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:51.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:51 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Jan 23 10:30:51 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3298491043' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:51.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:51 compute-2 ceph-mon[75771]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:51 compute-2 ceph-mon[75771]: pgmap v1132: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 769 B/s rd, 0 op/s
Jan 23 10:30:51 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2631473347' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:51 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3298491043' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:52 compute-2 nova_compute[225701]: 2026-01-23 10:30:52.211 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Jan 23 10:30:52 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/814440401' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:53 compute-2 ceph-mon[75771]: from='client.26803 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:53 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/814440401' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:53 compute-2 sudo[245679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:30:53 compute-2 sudo[245679]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:53 compute-2 sudo[245679]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:53.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Jan 23 10:30:53 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1998309136' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:53.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Jan 23 10:30:54 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2533419223' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:54 compute-2 nova_compute[225701]: 2026-01-23 10:30:54.395 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:54 compute-2 ceph-mon[75771]: from='client.26815 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:54 compute-2 ceph-mon[75771]: pgmap v1133: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 512 B/s rd, 0 op/s
Jan 23 10:30:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:54 compute-2 ceph-mon[75771]: from='client.26821 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1998309136' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2533419223' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:30:55.499 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:30:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:30:55.500 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:30:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:30:55.501 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:30:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:55.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 10:30:55 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1831285676' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:55.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Jan 23 10:30:55 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2165199586' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:56 compute-2 ceph-mon[75771]: from='client.26839 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:56 compute-2 ceph-mon[75771]: pgmap v1134: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 513 B/s rd, 0 op/s
Jan 23 10:30:56 compute-2 ceph-mon[75771]: from='client.26845 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:57 compute-2 nova_compute[225701]: 2026-01-23 10:30:57.214 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:57.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:30:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:57.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:30:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1831285676' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2165199586' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:57 compute-2 ceph-mon[75771]: pgmap v1135: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 769 B/s rd, 0 op/s
Jan 23 10:30:57 compute-2 ceph-mon[75771]: from='client.26863 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Jan 23 10:30:58 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4212707187' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Jan 23 10:30:58 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4094141977' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:30:59 compute-2 ceph-mon[75771]: from='client.26869 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4212707187' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4094141977' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:59 compute-2 nova_compute[225701]: 2026-01-23 10:30:59.435 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:59.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:30:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:59.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:00 compute-2 ceph-mon[75771]: pgmap v1136: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:01.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:01.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:02 compute-2 ceph-mon[75771]: pgmap v1137: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:02 compute-2 nova_compute[225701]: 2026-01-23 10:31:02.216 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:03 compute-2 ceph-mon[75771]: pgmap v1138: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:03.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:31:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:03.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:31:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:03 compute-2 nova_compute[225701]: 2026-01-23 10:31:03.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:03 compute-2 nova_compute[225701]: 2026-01-23 10:31:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:31:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:04 compute-2 nova_compute[225701]: 2026-01-23 10:31:04.482 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:31:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:05.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:05 compute-2 podman[245946]: 2026-01-23 10:31:05.636614886 +0000 UTC m=+0.055599738 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:31:05 compute-2 podman[245945]: 2026-01-23 10:31:05.668050853 +0000 UTC m=+0.087347182 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 10:31:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:05.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:06 compute-2 ceph-mon[75771]: pgmap v1139: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:06 compute-2 nova_compute[225701]: 2026-01-23 10:31:06.800 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:07 compute-2 nova_compute[225701]: 2026-01-23 10:31:07.218 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:07.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:07.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:07 compute-2 sudo[245990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:31:07 compute-2 sudo[245990]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:07 compute-2 sudo[245990]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:08 compute-2 ceph-mon[75771]: pgmap v1140: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:09 compute-2 nova_compute[225701]: 2026-01-23 10:31:09.521 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:09.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:09.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:09 compute-2 ceph-mon[75771]: pgmap v1141: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:10 compute-2 nova_compute[225701]: 2026-01-23 10:31:10.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:10 compute-2 nova_compute[225701]: 2026-01-23 10:31:10.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:31:10 compute-2 nova_compute[225701]: 2026-01-23 10:31:10.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:31:10 compute-2 nova_compute[225701]: 2026-01-23 10:31:10.800 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:31:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:11.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:11.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:12 compute-2 nova_compute[225701]: 2026-01-23 10:31:12.220 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:12 compute-2 ceph-mon[75771]: pgmap v1142: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:13 compute-2 ceph-mon[75771]: pgmap v1143: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:13.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:13.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:14 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2905912035' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:14 compute-2 nova_compute[225701]: 2026-01-23 10:31:14.524 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:14 compute-2 nova_compute[225701]: 2026-01-23 10:31:14.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:15.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:15.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:15 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/147924141' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:15 compute-2 ceph-mon[75771]: pgmap v1144: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:15 compute-2 nova_compute[225701]: 2026-01-23 10:31:15.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:16 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4255286849' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:16 compute-2 nova_compute[225701]: 2026-01-23 10:31:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:16 compute-2 nova_compute[225701]: 2026-01-23 10:31:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:16 compute-2 nova_compute[225701]: 2026-01-23 10:31:16.825 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:31:16 compute-2 nova_compute[225701]: 2026-01-23 10:31:16.825 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:31:16 compute-2 nova_compute[225701]: 2026-01-23 10:31:16.825 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:31:16 compute-2 nova_compute[225701]: 2026-01-23 10:31:16.826 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:31:16 compute-2 nova_compute[225701]: 2026-01-23 10:31:16.826 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:31:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:17 compute-2 nova_compute[225701]: 2026-01-23 10:31:17.222 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:31:17 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/177182351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:17 compute-2 nova_compute[225701]: 2026-01-23 10:31:17.329 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:31:17 compute-2 nova_compute[225701]: 2026-01-23 10:31:17.463 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:31:17 compute-2 nova_compute[225701]: 2026-01-23 10:31:17.464 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4657MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:31:17 compute-2 nova_compute[225701]: 2026-01-23 10:31:17.465 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:31:17 compute-2 nova_compute[225701]: 2026-01-23 10:31:17.465 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:31:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:17.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:17 compute-2 nova_compute[225701]: 2026-01-23 10:31:17.647 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:31:17 compute-2 nova_compute[225701]: 2026-01-23 10:31:17.647 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:31:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:17.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:17 compute-2 nova_compute[225701]: 2026-01-23 10:31:17.717 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:31:17 compute-2 ceph-mon[75771]: pgmap v1145: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2961280680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/177182351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:31:18 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/339495991' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:18 compute-2 nova_compute[225701]: 2026-01-23 10:31:18.295 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.578s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:31:18 compute-2 nova_compute[225701]: 2026-01-23 10:31:18.301 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:31:18 compute-2 nova_compute[225701]: 2026-01-23 10:31:18.318 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:31:18 compute-2 nova_compute[225701]: 2026-01-23 10:31:18.320 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:31:18 compute-2 nova_compute[225701]: 2026-01-23 10:31:18.320 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:31:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/339495991' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:19 compute-2 nova_compute[225701]: 2026-01-23 10:31:19.320 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:19 compute-2 nova_compute[225701]: 2026-01-23 10:31:19.338 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:19 compute-2 nova_compute[225701]: 2026-01-23 10:31:19.528 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:19.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:19.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:19 compute-2 nova_compute[225701]: 2026-01-23 10:31:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:20 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 10:31:20 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 10:31:20 compute-2 ceph-mon[75771]: pgmap v1146: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:20 compute-2 nova_compute[225701]: 2026-01-23 10:31:20.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:20 compute-2 nova_compute[225701]: 2026-01-23 10:31:20.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:31:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:31:21 compute-2 ceph-mon[75771]: pgmap v1147: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:21.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:21.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:21 compute-2 nova_compute[225701]: 2026-01-23 10:31:21.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:22 compute-2 nova_compute[225701]: 2026-01-23 10:31:22.283 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:23.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:23.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:24 compute-2 ceph-mon[75771]: pgmap v1148: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:24 compute-2 nova_compute[225701]: 2026-01-23 10:31:24.530 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:25.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:25.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:26 compute-2 ceph-mon[75771]: pgmap v1149: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:27 compute-2 nova_compute[225701]: 2026-01-23 10:31:27.185 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:27 compute-2 nova_compute[225701]: 2026-01-23 10:31:27.185 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:31:27 compute-2 nova_compute[225701]: 2026-01-23 10:31:27.200 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:31:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:27 compute-2 nova_compute[225701]: 2026-01-23 10:31:27.289 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:27.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:27 compute-2 ceph-mon[75771]: pgmap v1150: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:27.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:27 compute-2 sudo[246084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:31:27 compute-2 sudo[246084]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:27 compute-2 sudo[246084]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:28 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:29 compute-2 ceph-mon[75771]: pgmap v1151: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:29.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:29 compute-2 nova_compute[225701]: 2026-01-23 10:31:29.584 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:29.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:31.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:31.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:31 compute-2 ceph-mon[75771]: pgmap v1152: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:32 compute-2 nova_compute[225701]: 2026-01-23 10:31:32.293 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:33.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:33.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:33 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:34 compute-2 ceph-mon[75771]: pgmap v1153: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:34 compute-2 nova_compute[225701]: 2026-01-23 10:31:34.589 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:31:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:35.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:35.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:36 compute-2 ceph-mon[75771]: pgmap v1154: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:36 compute-2 podman[246118]: 2026-01-23 10:31:36.449819294 +0000 UTC m=+0.058704604 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 10:31:36 compute-2 podman[246117]: 2026-01-23 10:31:36.488714326 +0000 UTC m=+0.099579715 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:31:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:37 compute-2 nova_compute[225701]: 2026-01-23 10:31:37.296 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:37 compute-2 ceph-mon[75771]: pgmap v1155: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:37.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:37.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:38 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:39.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:39 compute-2 nova_compute[225701]: 2026-01-23 10:31:39.593 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:31:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:39.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:31:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:40 compute-2 ceph-mon[75771]: pgmap v1156: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:41 compute-2 ceph-mon[75771]: pgmap v1157: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:41.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:41.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:42 compute-2 nova_compute[225701]: 2026-01-23 10:31:42.299 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:43.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:43.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:43 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:44 compute-2 ceph-mon[75771]: pgmap v1158: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:44 compute-2 nova_compute[225701]: 2026-01-23 10:31:44.594 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:45 compute-2 ceph-mon[75771]: pgmap v1159: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:31:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:45.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:31:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:45.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:47 compute-2 nova_compute[225701]: 2026-01-23 10:31:47.300 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:47.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:47.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:47 compute-2 sudo[246171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:31:47 compute-2 sudo[246171]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:47 compute-2 sudo[246171]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:48 compute-2 ceph-mon[75771]: pgmap v1160: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:48 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1552987279' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:31:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1552987279' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:31:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:49 compute-2 nova_compute[225701]: 2026-01-23 10:31:49.596 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:49.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:49.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:50 compute-2 ceph-mon[75771]: pgmap v1161: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:31:50 compute-2 sudo[238657]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:50 compute-2 sshd-session[238656]: Received disconnect from 192.168.122.10 port 58248:11: disconnected by user
Jan 23 10:31:50 compute-2 sshd-session[238656]: Disconnected from user zuul 192.168.122.10 port 58248
Jan 23 10:31:50 compute-2 sshd-session[238653]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:31:50 compute-2 systemd-logind[786]: Session 55 logged out. Waiting for processes to exit.
Jan 23 10:31:50 compute-2 systemd[1]: session-55.scope: Deactivated successfully.
Jan 23 10:31:50 compute-2 systemd[1]: session-55.scope: Consumed 2min 57.559s CPU time, 747.7M memory peak, read 312.0M from disk, written 64.1M to disk.
Jan 23 10:31:50 compute-2 systemd-logind[786]: Removed session 55.
Jan 23 10:31:50 compute-2 sshd-session[246200]: Accepted publickey for zuul from 192.168.122.10 port 38738 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:31:50 compute-2 systemd-logind[786]: New session 56 of user zuul.
Jan 23 10:31:50 compute-2 systemd[1]: Started Session 56 of User zuul.
Jan 23 10:31:50 compute-2 sshd-session[246200]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:31:50 compute-2 nova_compute[225701]: 2026-01-23 10:31:50.678 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:50 compute-2 sudo[246204]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-2-2026-01-23-dpobane.tar.xz
Jan 23 10:31:50 compute-2 sudo[246204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:31:50 compute-2 sudo[246204]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:50 compute-2 sshd-session[246203]: Received disconnect from 192.168.122.10 port 38738:11: disconnected by user
Jan 23 10:31:50 compute-2 sshd-session[246203]: Disconnected from user zuul 192.168.122.10 port 38738
Jan 23 10:31:50 compute-2 sshd-session[246200]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:31:50 compute-2 systemd[1]: session-56.scope: Deactivated successfully.
Jan 23 10:31:50 compute-2 systemd-logind[786]: Session 56 logged out. Waiting for processes to exit.
Jan 23 10:31:50 compute-2 systemd-logind[786]: Removed session 56.
Jan 23 10:31:50 compute-2 sshd-session[246229]: Accepted publickey for zuul from 192.168.122.10 port 38744 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:31:50 compute-2 systemd-logind[786]: New session 57 of user zuul.
Jan 23 10:31:50 compute-2 systemd[1]: Started Session 57 of User zuul.
Jan 23 10:31:50 compute-2 sshd-session[246229]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:31:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:51 compute-2 sudo[246233]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 23 10:31:51 compute-2 sudo[246233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:31:51 compute-2 sudo[246233]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:51 compute-2 sshd-session[246232]: Received disconnect from 192.168.122.10 port 38744:11: disconnected by user
Jan 23 10:31:51 compute-2 sshd-session[246232]: Disconnected from user zuul 192.168.122.10 port 38744
Jan 23 10:31:51 compute-2 sshd-session[246229]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:31:51 compute-2 systemd[1]: session-57.scope: Deactivated successfully.
Jan 23 10:31:51 compute-2 systemd-logind[786]: Session 57 logged out. Waiting for processes to exit.
Jan 23 10:31:51 compute-2 systemd-logind[786]: Removed session 57.
Jan 23 10:31:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:51 compute-2 ceph-mon[75771]: pgmap v1162: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:51.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:52 compute-2 nova_compute[225701]: 2026-01-23 10:31:52.302 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:53 compute-2 sudo[246260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:31:53 compute-2 sudo[246260]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:53 compute-2 sudo[246260]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:53 compute-2 sudo[246285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 10:31:53 compute-2 sudo[246285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:53.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:53.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:53 compute-2 sudo[246285]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:54 compute-2 sudo[246331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:31:54 compute-2 sudo[246331]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:54 compute-2 sudo[246331]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:54 compute-2 sudo[246356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:31:54 compute-2 sudo[246356]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:54 compute-2 ceph-mon[75771]: pgmap v1163: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:31:54 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:31:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:54 compute-2 sudo[246356]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:54 compute-2 nova_compute[225701]: 2026-01-23 10:31:54.640 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:31:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:31:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:31:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:31:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:31:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:31:55 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:31:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:31:55.501 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:31:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:31:55.502 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:31:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:31:55.502 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:31:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:55.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:55.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:56 compute-2 ceph-mon[75771]: pgmap v1164: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:31:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:57 compute-2 nova_compute[225701]: 2026-01-23 10:31:57.305 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:57.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:57.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:58 compute-2 ceph-mon[75771]: pgmap v1165: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 801 B/s rd, 0 op/s
Jan 23 10:31:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:31:59 compute-2 ceph-mon[75771]: pgmap v1166: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:31:59 compute-2 sudo[246419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:31:59 compute-2 sudo[246419]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:59 compute-2 sudo[246419]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:31:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:59.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:31:59 compute-2 nova_compute[225701]: 2026-01-23 10:31:59.640 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:31:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:59.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:00 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:32:00 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:32:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:01 compute-2 ceph-mon[75771]: pgmap v1167: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:32:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:01.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:01.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:32:02 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 9973 writes, 39K keys, 9973 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 9973 writes, 2660 syncs, 3.75 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2178 writes, 7712 keys, 2178 commit groups, 1.0 writes per commit group, ingest: 7.91 MB, 0.01 MB/s
                                           Interval WAL: 2178 writes, 901 syncs, 2.42 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:32:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:02 compute-2 nova_compute[225701]: 2026-01-23 10:32:02.308 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.341154) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322341366, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2850, "num_deletes": 506, "total_data_size": 6397733, "memory_usage": 6490536, "flush_reason": "Manual Compaction"}
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322366528, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2686587, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33630, "largest_seqno": 36474, "table_properties": {"data_size": 2676761, "index_size": 5040, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3717, "raw_key_size": 31915, "raw_average_key_size": 21, "raw_value_size": 2652031, "raw_average_value_size": 1807, "num_data_blocks": 215, "num_entries": 1467, "num_filter_entries": 1467, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164133, "oldest_key_time": 1769164133, "file_creation_time": 1769164322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 25403 microseconds, and 8139 cpu microseconds.
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.366625) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2686587 bytes OK
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.366653) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.369094) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.369113) EVENT_LOG_v1 {"time_micros": 1769164322369109, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.369129) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6383313, prev total WAL file size 6383313, number of live WAL files 2.
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.370806) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303034' seq:72057594037927935, type:22 .. '6D6772737461740031323536' seq:0, type:0; will stop at (end)
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(2623KB)], [63(13MB)]
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322370945, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 17293732, "oldest_snapshot_seqno": -1}
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6858 keys, 14416410 bytes, temperature: kUnknown
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322495827, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 14416410, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14372124, "index_size": 26062, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 176986, "raw_average_key_size": 25, "raw_value_size": 14250340, "raw_average_value_size": 2077, "num_data_blocks": 1044, "num_entries": 6858, "num_filter_entries": 6858, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.496262) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 14416410 bytes
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.498681) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 138.3 rd, 115.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 13.9 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(11.8) write-amplify(5.4) OK, records in: 7802, records dropped: 944 output_compression: NoCompression
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.498712) EVENT_LOG_v1 {"time_micros": 1769164322498698, "job": 38, "event": "compaction_finished", "compaction_time_micros": 125013, "compaction_time_cpu_micros": 33641, "output_level": 6, "num_output_files": 1, "total_output_size": 14416410, "num_input_records": 7802, "num_output_records": 6858, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322499813, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322504483, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.370552) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504593) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504600) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:03 compute-2 ceph-mon[75771]: pgmap v1168: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:32:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:03.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:32:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:03.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:32:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:04 compute-2 nova_compute[225701]: 2026-01-23 10:32:04.642 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:05.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:05.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:05 compute-2 ceph-mon[75771]: pgmap v1169: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:32:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:32:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:06 compute-2 podman[246453]: 2026-01-23 10:32:06.644213904 +0000 UTC m=+0.061719303 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 10:32:06 compute-2 podman[246452]: 2026-01-23 10:32:06.712137697 +0000 UTC m=+0.129981644 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 10:32:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:07 compute-2 nova_compute[225701]: 2026-01-23 10:32:07.309 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:07.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:07.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:07 compute-2 ceph-mon[75771]: pgmap v1170: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:07 compute-2 sudo[246493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:32:07 compute-2 sudo[246493]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:32:07 compute-2 sudo[246493]: pam_unix(sudo:session): session closed for user root
Jan 23 10:32:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:08 compute-2 nova_compute[225701]: 2026-01-23 10:32:08.796 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:09.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:09 compute-2 nova_compute[225701]: 2026-01-23 10:32:09.645 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:09.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:09 compute-2 ceph-mon[75771]: pgmap v1171: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:10 compute-2 nova_compute[225701]: 2026-01-23 10:32:10.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:10 compute-2 nova_compute[225701]: 2026-01-23 10:32:10.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:32:10 compute-2 nova_compute[225701]: 2026-01-23 10:32:10.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:32:10 compute-2 nova_compute[225701]: 2026-01-23 10:32:10.803 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:32:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:11.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:11.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:11 compute-2 ceph-mon[75771]: pgmap v1172: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:12 compute-2 nova_compute[225701]: 2026-01-23 10:32:12.310 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:32:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:13.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:32:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:13.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:13 compute-2 ceph-mon[75771]: pgmap v1173: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:13 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:14 compute-2 nova_compute[225701]: 2026-01-23 10:32:14.647 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:15.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:15.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:15 compute-2 nova_compute[225701]: 2026-01-23 10:32:15.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:15 compute-2 ceph-mon[75771]: pgmap v1174: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:16 compute-2 nova_compute[225701]: 2026-01-23 10:32:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:16 compute-2 nova_compute[225701]: 2026-01-23 10:32:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:16 compute-2 nova_compute[225701]: 2026-01-23 10:32:16.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:16 compute-2 nova_compute[225701]: 2026-01-23 10:32:16.819 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:32:16 compute-2 nova_compute[225701]: 2026-01-23 10:32:16.820 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:32:16 compute-2 nova_compute[225701]: 2026-01-23 10:32:16.820 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:32:16 compute-2 nova_compute[225701]: 2026-01-23 10:32:16.820 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:32:16 compute-2 nova_compute[225701]: 2026-01-23 10:32:16.820 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:32:16 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3115399558' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:17 compute-2 nova_compute[225701]: 2026-01-23 10:32:17.312 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:17 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:32:17 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3707941147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:17 compute-2 nova_compute[225701]: 2026-01-23 10:32:17.347 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:32:17 compute-2 nova_compute[225701]: 2026-01-23 10:32:17.504 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:32:17 compute-2 nova_compute[225701]: 2026-01-23 10:32:17.506 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4850MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:32:17 compute-2 nova_compute[225701]: 2026-01-23 10:32:17.506 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:32:17 compute-2 nova_compute[225701]: 2026-01-23 10:32:17.507 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:32:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:17.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:17.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:17 compute-2 ceph-mon[75771]: pgmap v1175: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3707941147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1685129570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:17 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3359406760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.886895) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337886953, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 400, "num_deletes": 251, "total_data_size": 456611, "memory_usage": 464200, "flush_reason": "Manual Compaction"}
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337891440, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 298037, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36480, "largest_seqno": 36874, "table_properties": {"data_size": 295738, "index_size": 463, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5740, "raw_average_key_size": 18, "raw_value_size": 291160, "raw_average_value_size": 948, "num_data_blocks": 20, "num_entries": 307, "num_filter_entries": 307, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164322, "oldest_key_time": 1769164322, "file_creation_time": 1769164337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 4586 microseconds, and 2073 cpu microseconds.
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.891493) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 298037 bytes OK
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.891510) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893780) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893801) EVENT_LOG_v1 {"time_micros": 1769164337893795, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893818) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 454028, prev total WAL file size 454028, number of live WAL files 2.
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.894361) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(291KB)], [66(13MB)]
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337894405, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 14714447, "oldest_snapshot_seqno": -1}
Jan 23 10:32:17 compute-2 nova_compute[225701]: 2026-01-23 10:32:17.959 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:32:17 compute-2 nova_compute[225701]: 2026-01-23 10:32:17.959 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:32:17 compute-2 nova_compute[225701]: 2026-01-23 10:32:17.985 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6655 keys, 12553762 bytes, temperature: kUnknown
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337987140, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 12553762, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12512252, "index_size": 23798, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16645, "raw_key_size": 173472, "raw_average_key_size": 26, "raw_value_size": 12395417, "raw_average_value_size": 1862, "num_data_blocks": 943, "num_entries": 6655, "num_filter_entries": 6655, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.987469) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 12553762 bytes
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.989239) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.4 rd, 135.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 13.7 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(91.5) write-amplify(42.1) OK, records in: 7165, records dropped: 510 output_compression: NoCompression
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.989281) EVENT_LOG_v1 {"time_micros": 1769164337989262, "job": 40, "event": "compaction_finished", "compaction_time_micros": 92873, "compaction_time_cpu_micros": 30914, "output_level": 6, "num_output_files": 1, "total_output_size": 12553762, "num_input_records": 7165, "num_output_records": 6655, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337989578, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337996665, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.894251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996800) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:32:18 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4119112767' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:18 compute-2 nova_compute[225701]: 2026-01-23 10:32:18.477 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:32:18 compute-2 nova_compute[225701]: 2026-01-23 10:32:18.484 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:32:18 compute-2 nova_compute[225701]: 2026-01-23 10:32:18.511 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:32:18 compute-2 nova_compute[225701]: 2026-01-23 10:32:18.513 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:32:18 compute-2 nova_compute[225701]: 2026-01-23 10:32:18.513 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:32:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4119112767' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/204262838' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:19 compute-2 nova_compute[225701]: 2026-01-23 10:32:19.648 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:19.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:19.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:20 compute-2 ceph-mon[75771]: pgmap v1176: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:20 compute-2 nova_compute[225701]: 2026-01-23 10:32:20.513 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:20 compute-2 nova_compute[225701]: 2026-01-23 10:32:20.513 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:32:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:21.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:21.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:21 compute-2 nova_compute[225701]: 2026-01-23 10:32:21.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:21 compute-2 nova_compute[225701]: 2026-01-23 10:32:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:32:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:22 compute-2 ceph-mon[75771]: pgmap v1177: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:22 compute-2 nova_compute[225701]: 2026-01-23 10:32:22.316 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:23.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:23.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:24 compute-2 ceph-mon[75771]: pgmap v1178: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:24 compute-2 nova_compute[225701]: 2026-01-23 10:32:24.649 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:25 compute-2 ceph-mon[75771]: pgmap v1179: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:25.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:25.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:27 compute-2 nova_compute[225701]: 2026-01-23 10:32:27.319 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:27.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:27.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:27 compute-2 ceph-mon[75771]: pgmap v1180: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:28 compute-2 sudo[246582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:32:28 compute-2 sudo[246582]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:32:28 compute-2 sudo[246582]: pam_unix(sudo:session): session closed for user root
Jan 23 10:32:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:29.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:29 compute-2 nova_compute[225701]: 2026-01-23 10:32:29.694 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:29.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:31 compute-2 ceph-mon[75771]: pgmap v1181: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:31.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:31.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:32 compute-2 nova_compute[225701]: 2026-01-23 10:32:32.321 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:32 compute-2 ceph-mon[75771]: pgmap v1182: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:33.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:33.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:34 compute-2 ceph-mon[75771]: pgmap v1183: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:34 compute-2 nova_compute[225701]: 2026-01-23 10:32:34.696 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:32:35 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 23 10:32:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:35.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:35 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 10:32:35 compute-2 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 10:32:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:35.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:36 compute-2 ceph-mon[75771]: pgmap v1184: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:37 compute-2 nova_compute[225701]: 2026-01-23 10:32:37.324 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:37 compute-2 podman[246618]: 2026-01-23 10:32:37.632485729 +0000 UTC m=+0.048397265 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:32:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:37.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:37 compute-2 podman[246617]: 2026-01-23 10:32:37.684196836 +0000 UTC m=+0.101167018 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:32:37 compute-2 ceph-mon[75771]: pgmap v1185: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 0 B/s wr, 19 op/s
Jan 23 10:32:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:37.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:39.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:39 compute-2 nova_compute[225701]: 2026-01-23 10:32:39.698 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:39.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:40 compute-2 ceph-mon[75771]: pgmap v1186: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 11 KiB/s rd, 0 B/s wr, 18 op/s
Jan 23 10:32:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:41.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:41.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:42 compute-2 nova_compute[225701]: 2026-01-23 10:32:42.370 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:42 compute-2 ceph-mon[75771]: pgmap v1187: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 11 KiB/s rd, 0 B/s wr, 18 op/s
Jan 23 10:32:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:32:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:43.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:32:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:43.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:44 compute-2 ceph-mon[75771]: pgmap v1188: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 41 KiB/s rd, 0 B/s wr, 68 op/s
Jan 23 10:32:44 compute-2 nova_compute[225701]: 2026-01-23 10:32:44.723 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:45.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:45.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:45 compute-2 ceph-mon[75771]: pgmap v1189: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 41 KiB/s rd, 0 B/s wr, 68 op/s
Jan 23 10:32:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:47 compute-2 nova_compute[225701]: 2026-01-23 10:32:47.372 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:47.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:47.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:48 compute-2 sudo[246669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:32:48 compute-2 sudo[246669]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:32:48 compute-2 sudo[246669]: pam_unix(sudo:session): session closed for user root
Jan 23 10:32:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:48 compute-2 ceph-mon[75771]: pgmap v1190: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 89 KiB/s rd, 0 B/s wr, 147 op/s
Jan 23 10:32:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:49 compute-2 ceph-mon[75771]: pgmap v1191: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Jan 23 10:32:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:49.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:49 compute-2 nova_compute[225701]: 2026-01-23 10:32:49.726 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:49.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:32:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:51.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:51 compute-2 ceph-mon[75771]: pgmap v1192: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Jan 23 10:32:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:51.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:52 compute-2 nova_compute[225701]: 2026-01-23 10:32:52.375 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:53.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:32:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:53.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:32:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:54 compute-2 nova_compute[225701]: 2026-01-23 10:32:54.728 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:54 compute-2 ceph-mon[75771]: pgmap v1193: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 93 KiB/s rd, 0 B/s wr, 155 op/s
Jan 23 10:32:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:32:55.502 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:32:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:32:55.503 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:32:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:32:55.503 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:32:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:32:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:55.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:32:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:32:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:55.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:32:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:56 compute-2 ceph-mon[75771]: pgmap v1194: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 63 KiB/s rd, 0 B/s wr, 105 op/s
Jan 23 10:32:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:57 compute-2 nova_compute[225701]: 2026-01-23 10:32:57.378 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:57 compute-2 ceph-mon[75771]: pgmap v1195: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 64 KiB/s rd, 0 B/s wr, 105 op/s
Jan 23 10:32:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:57.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:57.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:32:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:59.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:59 compute-2 nova_compute[225701]: 2026-01-23 10:32:59.731 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:59 compute-2 sudo[246706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:32:59 compute-2 sudo[246706]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:32:59 compute-2 sudo[246706]: pam_unix(sudo:session): session closed for user root
Jan 23 10:32:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:32:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:59.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:59 compute-2 sudo[246731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:32:59 compute-2 sudo[246731]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:00 compute-2 ceph-mon[75771]: pgmap v1196: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 0 B/s wr, 26 op/s
Jan 23 10:33:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:00 compute-2 sudo[246731]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:33:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:33:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:33:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:33:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:33:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:33:01 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:33:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:01.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:01.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:02 compute-2 ceph-mon[75771]: pgmap v1197: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 0 B/s wr, 26 op/s
Jan 23 10:33:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:02 compute-2 nova_compute[225701]: 2026-01-23 10:33:02.380 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:03.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:03.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:04 compute-2 ceph-mon[75771]: pgmap v1198: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 0 B/s wr, 27 op/s
Jan 23 10:33:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:04 compute-2 nova_compute[225701]: 2026-01-23 10:33:04.734 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:05 compute-2 sudo[246793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:33:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:05 compute-2 sudo[246793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:05 compute-2 sudo[246793]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:05.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:05.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:05 compute-2 ceph-mon[75771]: pgmap v1199: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 522 B/s rd, 0 op/s
Jan 23 10:33:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:33:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:33:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:33:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:07 compute-2 nova_compute[225701]: 2026-01-23 10:33:07.382 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.463886) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387463926, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 743, "num_deletes": 251, "total_data_size": 1499624, "memory_usage": 1527592, "flush_reason": "Manual Compaction"}
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387471352, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 980518, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36880, "largest_seqno": 37617, "table_properties": {"data_size": 976921, "index_size": 1441, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7221, "raw_average_key_size": 17, "raw_value_size": 969770, "raw_average_value_size": 2298, "num_data_blocks": 62, "num_entries": 422, "num_filter_entries": 422, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164337, "oldest_key_time": 1769164337, "file_creation_time": 1769164387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 7520 microseconds, and 3402 cpu microseconds.
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.471406) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 980518 bytes OK
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.471424) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.473414) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.473428) EVENT_LOG_v1 {"time_micros": 1769164387473423, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.473445) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 1495708, prev total WAL file size 1495708, number of live WAL files 2.
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.474081) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323531' seq:72057594037927935, type:22 .. '6B7600353033' seq:0, type:0; will stop at (end)
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(957KB)], [69(11MB)]
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387474206, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 13534280, "oldest_snapshot_seqno": -1}
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6561 keys, 12131420 bytes, temperature: kUnknown
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387561925, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12131420, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12090525, "index_size": 23375, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16453, "raw_key_size": 173192, "raw_average_key_size": 26, "raw_value_size": 11975107, "raw_average_value_size": 1825, "num_data_blocks": 913, "num_entries": 6561, "num_filter_entries": 6561, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.562170) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12131420 bytes
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.564130) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 154.1 rd, 138.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.0 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(26.2) write-amplify(12.4) OK, records in: 7077, records dropped: 516 output_compression: NoCompression
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.564147) EVENT_LOG_v1 {"time_micros": 1769164387564140, "job": 42, "event": "compaction_finished", "compaction_time_micros": 87813, "compaction_time_cpu_micros": 28686, "output_level": 6, "num_output_files": 1, "total_output_size": 12131420, "num_input_records": 7077, "num_output_records": 6561, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387564468, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387566537, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.473918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566641) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566649) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:07.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:07.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:08 compute-2 ceph-mon[75771]: pgmap v1200: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 783 B/s rd, 0 op/s
Jan 23 10:33:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:08 compute-2 sudo[246820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:33:08 compute-2 sudo[246820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:08 compute-2 sudo[246820]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:08 compute-2 podman[246845]: 2026-01-23 10:33:08.343675587 +0000 UTC m=+0.054919107 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 10:33:08 compute-2 podman[246844]: 2026-01-23 10:33:08.409318464 +0000 UTC m=+0.123844914 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 10:33:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:09.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:09 compute-2 nova_compute[225701]: 2026-01-23 10:33:09.764 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:09 compute-2 nova_compute[225701]: 2026-01-23 10:33:09.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:09.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:10 compute-2 ceph-mon[75771]: pgmap v1201: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 522 B/s rd, 0 op/s
Jan 23 10:33:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:11.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:33:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:11.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:33:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:12 compute-2 ceph-mon[75771]: pgmap v1202: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 522 B/s rd, 0 op/s
Jan 23 10:33:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:12 compute-2 nova_compute[225701]: 2026-01-23 10:33:12.385 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:12 compute-2 nova_compute[225701]: 2026-01-23 10:33:12.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:12 compute-2 nova_compute[225701]: 2026-01-23 10:33:12.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:33:12 compute-2 nova_compute[225701]: 2026-01-23 10:33:12.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:33:12 compute-2 nova_compute[225701]: 2026-01-23 10:33:12.806 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:33:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:13.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:13.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:14 compute-2 ceph-mon[75771]: pgmap v1203: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 764 B/s rd, 0 op/s
Jan 23 10:33:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:14 compute-2 nova_compute[225701]: 2026-01-23 10:33:14.768 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:15 compute-2 ceph-mon[75771]: pgmap v1204: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 509 B/s rd, 0 op/s
Jan 23 10:33:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:15.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:15.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:16 compute-2 nova_compute[225701]: 2026-01-23 10:33:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:17 compute-2 nova_compute[225701]: 2026-01-23 10:33:17.387 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:17 compute-2 ceph-mon[75771]: pgmap v1205: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 764 B/s rd, 0 op/s
Jan 23 10:33:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:17.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:17 compute-2 nova_compute[225701]: 2026-01-23 10:33:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:17 compute-2 nova_compute[225701]: 2026-01-23 10:33:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:33:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:17.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.017 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.017 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.018 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.018 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.018 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:33:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:18 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:33:18 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1310870223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.504 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:33:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1310870223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/944075048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:18 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4059943484' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.657 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.658 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4846MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.658 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.659 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.781 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.781 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.866 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.932 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:33:18 compute-2 nova_compute[225701]: 2026-01-23 10:33:18.933 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:33:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:19 compute-2 nova_compute[225701]: 2026-01-23 10:33:19.153 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:33:19 compute-2 nova_compute[225701]: 2026-01-23 10:33:19.180 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:33:19 compute-2 nova_compute[225701]: 2026-01-23 10:33:19.229 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:33:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:33:19 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3715483574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:19 compute-2 nova_compute[225701]: 2026-01-23 10:33:19.709 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:33:19 compute-2 nova_compute[225701]: 2026-01-23 10:33:19.715 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:33:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:33:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:19.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:33:19 compute-2 nova_compute[225701]: 2026-01-23 10:33:19.806 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:19 compute-2 ceph-mon[75771]: pgmap v1206: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 509 B/s rd, 0 op/s
Jan 23 10:33:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1727738214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:19 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2868338412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:33:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:19.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:33:19 compute-2 nova_compute[225701]: 2026-01-23 10:33:19.822 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:33:19 compute-2 nova_compute[225701]: 2026-01-23 10:33:19.824 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:33:19 compute-2 nova_compute[225701]: 2026-01-23 10:33:19.825 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:33:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3715483574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:33:20 compute-2 nova_compute[225701]: 2026-01-23 10:33:20.825 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:20 compute-2 nova_compute[225701]: 2026-01-23 10:33:20.825 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:20 compute-2 nova_compute[225701]: 2026-01-23 10:33:20.825 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:21.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:21.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:21 compute-2 ceph-mon[75771]: pgmap v1207: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 509 B/s rd, 0 op/s
Jan 23 10:33:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:22 compute-2 nova_compute[225701]: 2026-01-23 10:33:22.389 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:22 compute-2 nova_compute[225701]: 2026-01-23 10:33:22.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:22 compute-2 nova_compute[225701]: 2026-01-23 10:33:22.796 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:22 compute-2 nova_compute[225701]: 2026-01-23 10:33:22.797 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:33:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:23.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:33:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:23.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:33:23 compute-2 ceph-mon[75771]: pgmap v1208: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 764 B/s rd, 0 op/s
Jan 23 10:33:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:24 compute-2 nova_compute[225701]: 2026-01-23 10:33:24.808 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:25.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:25.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:25 compute-2 ceph-mon[75771]: pgmap v1209: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:27 compute-2 nova_compute[225701]: 2026-01-23 10:33:27.391 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:27.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:33:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:27.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:33:27 compute-2 ceph-mon[75771]: pgmap v1210: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:28 compute-2 sudo[246951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:33:28 compute-2 sudo[246951]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:28 compute-2 sudo[246951]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:29.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:29 compute-2 nova_compute[225701]: 2026-01-23 10:33:29.810 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:29.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:30 compute-2 ceph-mon[75771]: pgmap v1211: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:31.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:33:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:31.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:33:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:32 compute-2 ceph-mon[75771]: pgmap v1212: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:32 compute-2 nova_compute[225701]: 2026-01-23 10:33:32.394 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:33.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:33.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:34 compute-2 ceph-mon[75771]: pgmap v1213: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:34 compute-2 nova_compute[225701]: 2026-01-23 10:33:34.812 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:35 compute-2 ceph-mon[75771]: pgmap v1214: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:33:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:35.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:35.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:37 compute-2 nova_compute[225701]: 2026-01-23 10:33:37.396 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:37 compute-2 ceph-mon[75771]: pgmap v1215: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:37.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:33:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:37.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:33:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:38 compute-2 podman[246989]: 2026-01-23 10:33:38.682639138 +0000 UTC m=+0.095206423 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 23 10:33:38 compute-2 podman[246988]: 2026-01-23 10:33:38.688079271 +0000 UTC m=+0.108521799 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:33:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:39 compute-2 ceph-mon[75771]: pgmap v1216: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:39.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:39 compute-2 nova_compute[225701]: 2026-01-23 10:33:39.813 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:39.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:41.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:41 compute-2 ceph-mon[75771]: pgmap v1217: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:41.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:42 compute-2 nova_compute[225701]: 2026-01-23 10:33:42.398 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:43.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:33:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:43.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:33:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:44 compute-2 ceph-mon[75771]: pgmap v1218: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:44 compute-2 nova_compute[225701]: 2026-01-23 10:33:44.815 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:45.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:45.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:46 compute-2 ceph-mon[75771]: pgmap v1219: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:47 compute-2 nova_compute[225701]: 2026-01-23 10:33:47.400 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:33:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:47.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:33:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:33:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:47.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:33:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:48 compute-2 ceph-mon[75771]: pgmap v1220: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:48 compute-2 sudo[247043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:33:48 compute-2 sudo[247043]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:48 compute-2 sudo[247043]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2430308627' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:33:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2430308627' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:33:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:49.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:49 compute-2 nova_compute[225701]: 2026-01-23 10:33:49.818 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:49.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:50 compute-2 ceph-mon[75771]: pgmap v1221: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:33:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:51.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:33:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:51.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:33:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:52 compute-2 ceph-mon[75771]: pgmap v1222: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:52 compute-2 nova_compute[225701]: 2026-01-23 10:33:52.403 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:53.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:53 compute-2 ceph-mon[75771]: pgmap v1223: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:53.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:54 compute-2 nova_compute[225701]: 2026-01-23 10:33:54.819 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:33:55.504 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:33:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:33:55.504 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:33:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:33:55.505 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:33:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:33:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:55.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:33:55 compute-2 ceph-mon[75771]: pgmap v1224: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:33:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:55.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:33:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:57 compute-2 nova_compute[225701]: 2026-01-23 10:33:57.405 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:57.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:33:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:57.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:33:57 compute-2 ceph-mon[75771]: pgmap v1225: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:33:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:59.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:59 compute-2 nova_compute[225701]: 2026-01-23 10:33:59.821 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:33:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:59.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:00 compute-2 ceph-mon[75771]: pgmap v1226: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:01 compute-2 ceph-mon[75771]: pgmap v1227: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:01.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:01.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:02 compute-2 nova_compute[225701]: 2026-01-23 10:34:02.407 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:34:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:03.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:34:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:03.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:03 compute-2 ceph-mon[75771]: pgmap v1228: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:04 compute-2 nova_compute[225701]: 2026-01-23 10:34:04.823 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:05 compute-2 sudo[247086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:34:05 compute-2 sudo[247086]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:05 compute-2 sudo[247086]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:05 compute-2 sudo[247111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:34:05 compute-2 sudo[247111]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:05.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:05.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:05 compute-2 sudo[247111]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:05 compute-2 ceph-mon[75771]: pgmap v1229: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:34:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:34:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:34:06 compute-2 ceph-mon[75771]: pgmap v1230: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 794 B/s rd, 0 op/s
Jan 23 10:34:06 compute-2 ceph-mon[75771]: pgmap v1231: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:34:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:34:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:34:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:34:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:34:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:07 compute-2 nova_compute[225701]: 2026-01-23 10:34:07.420 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:07.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:07.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:08 compute-2 sudo[247170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:34:08 compute-2 sudo[247170]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:08 compute-2 sudo[247170]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:09 compute-2 ceph-mon[75771]: pgmap v1232: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:09 compute-2 podman[247197]: 2026-01-23 10:34:09.625673651 +0000 UTC m=+0.048605841 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 10:34:09 compute-2 podman[247196]: 2026-01-23 10:34:09.693510352 +0000 UTC m=+0.121271250 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:34:09 compute-2 nova_compute[225701]: 2026-01-23 10:34:09.825 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:09.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:09.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:10 compute-2 ceph-mon[75771]: pgmap v1233: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:10 compute-2 nova_compute[225701]: 2026-01-23 10:34:10.797 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:11 compute-2 sudo[247241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:34:11 compute-2 sudo[247241]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:11 compute-2 sudo[247241]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:11.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:11.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:34:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:34:12 compute-2 nova_compute[225701]: 2026-01-23 10:34:12.422 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:13 compute-2 ceph-mon[75771]: pgmap v1234: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:13.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:13.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:14 compute-2 nova_compute[225701]: 2026-01-23 10:34:14.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:14 compute-2 nova_compute[225701]: 2026-01-23 10:34:14.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:34:14 compute-2 nova_compute[225701]: 2026-01-23 10:34:14.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:34:14 compute-2 nova_compute[225701]: 2026-01-23 10:34:14.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:34:14 compute-2 nova_compute[225701]: 2026-01-23 10:34:14.825 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:15 compute-2 ceph-mon[75771]: pgmap v1235: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:15.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:15.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:16 compute-2 ceph-mon[75771]: pgmap v1236: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Jan 23 10:34:16 compute-2 nova_compute[225701]: 2026-01-23 10:34:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:17 compute-2 nova_compute[225701]: 2026-01-23 10:34:17.424 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:17 compute-2 nova_compute[225701]: 2026-01-23 10:34:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:34:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:17.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:34:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:17.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:19 compute-2 ceph-mon[75771]: pgmap v1237: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:19 compute-2 nova_compute[225701]: 2026-01-23 10:34:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:19 compute-2 nova_compute[225701]: 2026-01-23 10:34:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:19 compute-2 nova_compute[225701]: 2026-01-23 10:34:19.827 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:19.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:19.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.069 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.069 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.069 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.070 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.070 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:34:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:20 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:34:20 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/505590307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.534 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.702 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.703 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4852MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.703 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:34:20 compute-2 nova_compute[225701]: 2026-01-23 10:34:20.704 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:34:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1917450218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:34:20 compute-2 ceph-mon[75771]: pgmap v1238: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/740635230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/505590307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:21 compute-2 nova_compute[225701]: 2026-01-23 10:34:21.259 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:34:21 compute-2 nova_compute[225701]: 2026-01-23 10:34:21.259 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:34:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:21 compute-2 nova_compute[225701]: 2026-01-23 10:34:21.279 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:34:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:34:21 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1417754147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:21 compute-2 nova_compute[225701]: 2026-01-23 10:34:21.732 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:34:21 compute-2 nova_compute[225701]: 2026-01-23 10:34:21.739 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:34:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:21.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:21.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:21 compute-2 nova_compute[225701]: 2026-01-23 10:34:21.968 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:34:21 compute-2 nova_compute[225701]: 2026-01-23 10:34:21.969 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:34:21 compute-2 nova_compute[225701]: 2026-01-23 10:34:21.970 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:34:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:22 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1049699909' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:22 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1536309353' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:22 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1417754147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:22 compute-2 nova_compute[225701]: 2026-01-23 10:34:22.426 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:22 compute-2 nova_compute[225701]: 2026-01-23 10:34:22.970 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:22 compute-2 nova_compute[225701]: 2026-01-23 10:34:22.970 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:23 compute-2 ceph-mon[75771]: pgmap v1239: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:23 compute-2 nova_compute[225701]: 2026-01-23 10:34:23.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:23 compute-2 nova_compute[225701]: 2026-01-23 10:34:23.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:34:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:23.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:23.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:24 compute-2 nova_compute[225701]: 2026-01-23 10:34:24.831 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:25 compute-2 ceph-mon[75771]: pgmap v1240: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:25.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:25.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:26 compute-2 ceph-mon[75771]: pgmap v1241: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:27 compute-2 nova_compute[225701]: 2026-01-23 10:34:27.428 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:34:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:27.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:34:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:34:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:27.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:34:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:28 compute-2 sudo[247328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:34:28 compute-2 sudo[247328]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:28 compute-2 sudo[247328]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:29 compute-2 ceph-mon[75771]: pgmap v1242: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:29 compute-2 nova_compute[225701]: 2026-01-23 10:34:29.832 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:29.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:29.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:30 compute-2 ceph-mon[75771]: pgmap v1243: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:31.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:31.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:32 compute-2 nova_compute[225701]: 2026-01-23 10:34:32.430 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:32 compute-2 ceph-mon[75771]: pgmap v1244: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:33.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:33.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:34 compute-2 nova_compute[225701]: 2026-01-23 10:34:34.836 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:35 compute-2 ceph-mon[75771]: pgmap v1245: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:35.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:35.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:34:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:37 compute-2 ceph-mon[75771]: pgmap v1246: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:37 compute-2 nova_compute[225701]: 2026-01-23 10:34:37.432 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:37.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:37.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:39 compute-2 ceph-mon[75771]: pgmap v1247: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:39 compute-2 nova_compute[225701]: 2026-01-23 10:34:39.836 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:39.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:34:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:39.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:34:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:40 compute-2 ceph-mon[75771]: pgmap v1248: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:40 compute-2 podman[247366]: 2026-01-23 10:34:40.623133771 +0000 UTC m=+0.047938565 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 10:34:40 compute-2 podman[247365]: 2026-01-23 10:34:40.65371343 +0000 UTC m=+0.078839123 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:34:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:41.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:41.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:42 compute-2 nova_compute[225701]: 2026-01-23 10:34:42.435 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:43 compute-2 ceph-mon[75771]: pgmap v1249: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:43.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:43.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:44 compute-2 nova_compute[225701]: 2026-01-23 10:34:44.837 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:45 compute-2 ceph-mon[75771]: pgmap v1250: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:45.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:45.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:47 compute-2 ceph-mon[75771]: pgmap v1251: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:47 compute-2 nova_compute[225701]: 2026-01-23 10:34:47.436 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:47.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:47.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:48 compute-2 sudo[247416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:34:48 compute-2 sudo[247416]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:48 compute-2 sudo[247416]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:49 compute-2 ceph-mon[75771]: pgmap v1252: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/4134536473' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:34:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/4134536473' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:34:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:49 compute-2 nova_compute[225701]: 2026-01-23 10:34:49.840 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:49.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:49.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:34:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:51 compute-2 ceph-mon[75771]: pgmap v1253: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:51.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:51.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:52 compute-2 nova_compute[225701]: 2026-01-23 10:34:52.438 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:53 compute-2 ceph-mon[75771]: pgmap v1254: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:34:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:53.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:34:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000047s ======
Jan 23 10:34:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:53.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000047s
Jan 23 10:34:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:54 compute-2 nova_compute[225701]: 2026-01-23 10:34:54.841 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:55 compute-2 ceph-mon[75771]: pgmap v1255: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:34:55.505 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:34:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:34:55.506 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:34:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:34:55.506 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:34:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:55.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:55.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:57 compute-2 nova_compute[225701]: 2026-01-23 10:34:57.441 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:57 compute-2 ceph-mon[75771]: pgmap v1256: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:57.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:57.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:58 compute-2 ceph-mon[75771]: pgmap v1257: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:34:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:59 compute-2 nova_compute[225701]: 2026-01-23 10:34:59.843 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:34:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:59.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:34:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:34:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:59.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:01 compute-2 ceph-mon[75771]: pgmap v1258: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:01.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:01.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:02 compute-2 nova_compute[225701]: 2026-01-23 10:35:02.444 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:02 compute-2 ceph-mon[75771]: pgmap v1259: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:03.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:03.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:04 compute-2 ceph-mon[75771]: pgmap v1260: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:04 compute-2 nova_compute[225701]: 2026-01-23 10:35:04.846 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:35:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:05.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:05 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:05 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:05 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:05.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:06 compute-2 ceph-mon[75771]: pgmap v1261: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:07 compute-2 nova_compute[225701]: 2026-01-23 10:35:07.446 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:07.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:07 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:07 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:07 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:07.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:08 compute-2 sudo[247461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:35:08 compute-2 sudo[247461]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:08 compute-2 sudo[247461]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:09 compute-2 ceph-mon[75771]: pgmap v1262: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:09 compute-2 nova_compute[225701]: 2026-01-23 10:35:09.847 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:09.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:09 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:09 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:09 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:09.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:10 compute-2 ceph-mon[75771]: pgmap v1263: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:11 compute-2 sudo[247489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:35:11 compute-2 sudo[247489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:11 compute-2 sudo[247489]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:11 compute-2 podman[247490]: 2026-01-23 10:35:11.62705722 +0000 UTC m=+0.047775161 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 10:35:11 compute-2 podman[247488]: 2026-01-23 10:35:11.650814852 +0000 UTC m=+0.077643702 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 10:35:11 compute-2 sudo[247545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 10:35:11 compute-2 sudo[247545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:11.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:11 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:11 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:35:11 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:11.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:35:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:12 compute-2 podman[247652]: 2026-01-23 10:35:12.183158358 +0000 UTC m=+0.069337958 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 23 10:35:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:12 compute-2 podman[247652]: 2026-01-23 10:35:12.310157499 +0000 UTC m=+0.196337069 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 10:35:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 10:35:12 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 10:35:12 compute-2 nova_compute[225701]: 2026-01-23 10:35:12.448 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:12 compute-2 podman[247756]: 2026-01-23 10:35:12.634808379 +0000 UTC m=+0.050316913 container exec 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:35:12 compute-2 podman[247756]: 2026-01-23 10:35:12.650279948 +0000 UTC m=+0.065788492 container exec_died 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:35:12 compute-2 nova_compute[225701]: 2026-01-23 10:35:12.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:13 compute-2 podman[247910]: 2026-01-23 10:35:13.232005515 +0000 UTC m=+0.050396325 container exec c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 10:35:13 compute-2 podman[247910]: 2026-01-23 10:35:13.240432061 +0000 UTC m=+0.058822831 container exec_died c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 10:35:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:13 compute-2 ceph-mon[75771]: pgmap v1264: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:13 compute-2 podman[247975]: 2026-01-23 10:35:13.419794314 +0000 UTC m=+0.047454204 container exec 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, version=2.2.4, vendor=Red Hat, Inc., distribution-scope=public, name=keepalived, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, release=1793, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Jan 23 10:35:13 compute-2 podman[247975]: 2026-01-23 10:35:13.43394857 +0000 UTC m=+0.061608440 container exec_died 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, version=2.2.4, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived)
Jan 23 10:35:13 compute-2 sudo[247545]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:13.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:13 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:13 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:13 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:13.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:14 compute-2 sudo[248041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:35:14 compute-2 sudo[248041]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:14 compute-2 sudo[248041]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:14 compute-2 nova_compute[225701]: 2026-01-23 10:35:14.848 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:14 compute-2 sudo[248066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:35:14 compute-2 sudo[248066]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:15 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:15 compute-2 ceph-mon[75771]: pgmap v1265: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:15 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:15 compute-2 sudo[248066]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:15.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:15 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:15 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:35:15 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:15.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:35:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:16 compute-2 nova_compute[225701]: 2026-01-23 10:35:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:16 compute-2 nova_compute[225701]: 2026-01-23 10:35:16.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:35:16 compute-2 nova_compute[225701]: 2026-01-23 10:35:16.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:35:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 10:35:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:35:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:35:17 compute-2 ceph-mon[75771]: pgmap v1266: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:35:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:35:17 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:35:17 compute-2 nova_compute[225701]: 2026-01-23 10:35:17.204 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:35:17 compute-2 nova_compute[225701]: 2026-01-23 10:35:17.205 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:17 compute-2 nova_compute[225701]: 2026-01-23 10:35:17.490 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:17.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:17 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:17 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:17 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:17.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:18 compute-2 ceph-mon[75771]: pgmap v1267: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:18 compute-2 nova_compute[225701]: 2026-01-23 10:35:18.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:19 compute-2 nova_compute[225701]: 2026-01-23 10:35:19.850 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:35:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:19.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:35:19 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:19 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:19 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:19.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:20 compute-2 ceph-mon[75771]: pgmap v1268: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:35:20 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/905573868' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:21 compute-2 nova_compute[225701]: 2026-01-23 10:35:21.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:21 compute-2 nova_compute[225701]: 2026-01-23 10:35:21.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:21 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/573725531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:21 compute-2 nova_compute[225701]: 2026-01-23 10:35:21.810 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:35:21 compute-2 nova_compute[225701]: 2026-01-23 10:35:21.811 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:35:21 compute-2 nova_compute[225701]: 2026-01-23 10:35:21.811 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:35:21 compute-2 nova_compute[225701]: 2026-01-23 10:35:21.811 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:35:21 compute-2 nova_compute[225701]: 2026-01-23 10:35:21.812 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:35:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:21.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:21 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:21 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:21 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:21.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:22 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:35:22 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3397794839' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:22 compute-2 nova_compute[225701]: 2026-01-23 10:35:22.265 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:35:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:22 compute-2 nova_compute[225701]: 2026-01-23 10:35:22.419 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:35:22 compute-2 nova_compute[225701]: 2026-01-23 10:35:22.420 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4791MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:35:22 compute-2 nova_compute[225701]: 2026-01-23 10:35:22.420 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:35:22 compute-2 nova_compute[225701]: 2026-01-23 10:35:22.420 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:35:22 compute-2 nova_compute[225701]: 2026-01-23 10:35:22.493 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:22 compute-2 nova_compute[225701]: 2026-01-23 10:35:22.586 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:35:22 compute-2 nova_compute[225701]: 2026-01-23 10:35:22.587 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:35:22 compute-2 nova_compute[225701]: 2026-01-23 10:35:22.660 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:35:23 compute-2 ceph-mon[75771]: pgmap v1269: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 819 B/s rd, 0 op/s
Jan 23 10:35:23 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3397794839' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:23 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1836247010' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:35:23 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/342726147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:23 compute-2 nova_compute[225701]: 2026-01-23 10:35:23.111 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:35:23 compute-2 nova_compute[225701]: 2026-01-23 10:35:23.116 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:35:23 compute-2 nova_compute[225701]: 2026-01-23 10:35:23.191 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:35:23 compute-2 nova_compute[225701]: 2026-01-23 10:35:23.193 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:35:23 compute-2 nova_compute[225701]: 2026-01-23 10:35:23.193 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:35:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:23 compute-2 sudo[248174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:35:23 compute-2 sudo[248174]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:23 compute-2 sudo[248174]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:23.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:23 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:23 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:23 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:23.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:24 compute-2 nova_compute[225701]: 2026-01-23 10:35:24.193 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:24 compute-2 nova_compute[225701]: 2026-01-23 10:35:24.194 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/342726147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:24 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3852357449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:24 compute-2 nova_compute[225701]: 2026-01-23 10:35:24.852 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:25 compute-2 ceph-mon[75771]: pgmap v1270: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:25 compute-2 nova_compute[225701]: 2026-01-23 10:35:25.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:25 compute-2 nova_compute[225701]: 2026-01-23 10:35:25.856 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:25 compute-2 nova_compute[225701]: 2026-01-23 10:35:25.856 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:35:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:25 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:25 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:25 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:25.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:26 compute-2 ceph-mon[75771]: pgmap v1271: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:27 compute-2 nova_compute[225701]: 2026-01-23 10:35:27.536 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:35:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:27.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:35:27 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:27 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:27 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:27.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:29 compute-2 sudo[248205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:35:29 compute-2 sudo[248205]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:29 compute-2 sudo[248205]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:29 compute-2 ceph-mon[75771]: pgmap v1272: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:29 compute-2 nova_compute[225701]: 2026-01-23 10:35:29.720 225706 DEBUG oslo_concurrency.processutils [None req-a8510dbf-d677-4163-b681-0279df98cd8c 00aca23f964f49a5a9abfea9744e871b 5220cd4f58cb43bb899e367e961bc5c1 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:35:29 compute-2 nova_compute[225701]: 2026-01-23 10:35:29.754 225706 DEBUG oslo_concurrency.processutils [None req-a8510dbf-d677-4163-b681-0279df98cd8c 00aca23f964f49a5a9abfea9744e871b 5220cd4f58cb43bb899e367e961bc5c1 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:35:29 compute-2 nova_compute[225701]: 2026-01-23 10:35:29.854 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:29.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:29 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:29 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:29 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:29.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:31 compute-2 ceph-mon[75771]: pgmap v1273: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:31.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:31 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:31 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:31 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:31.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:32 compute-2 nova_compute[225701]: 2026-01-23 10:35:32.538 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:32 compute-2 ceph-mon[75771]: pgmap v1274: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:33.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:33 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:33 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:33 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:33.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:34 compute-2 nova_compute[225701]: 2026-01-23 10:35:34.857 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:34 compute-2 ceph-mon[75771]: pgmap v1275: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:35 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:35:35.630 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:35:35 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:35:35.631 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:35:35 compute-2 nova_compute[225701]: 2026-01-23 10:35:35.673 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:35.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:35 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:35 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:35 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:35.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:35:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:37 compute-2 ceph-mon[75771]: pgmap v1276: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:37 compute-2 nova_compute[225701]: 2026-01-23 10:35:37.541 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:37.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:37 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:37 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:35:37 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:37.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:35:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:38 compute-2 ceph-mon[75771]: pgmap v1277: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:39 compute-2 nova_compute[225701]: 2026-01-23 10:35:39.859 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:35:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:39.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:35:39 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:39 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:39 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:39.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:40 compute-2 ceph-mon[75771]: pgmap v1278: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:41.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:41 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:41 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:41 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:41.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:42 compute-2 nova_compute[225701]: 2026-01-23 10:35:42.544 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:42 compute-2 podman[248246]: 2026-01-23 10:35:42.63270285 +0000 UTC m=+0.051553853 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 23 10:35:42 compute-2 podman[248245]: 2026-01-23 10:35:42.667768339 +0000 UTC m=+0.086776606 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 10:35:42 compute-2 ceph-mon[75771]: pgmap v1279: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:35:43 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:43.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:35:43 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c1ba5d0 =====
Jan 23 10:35:43 compute-2 radosgw[82185]: ====== req done req=0x7f821c1ba5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:43 compute-2 radosgw[82185]: beast: 0x7f821c1ba5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:43.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:44 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:35:44.634 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:35:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:44 compute-2 ceph-mon[75771]: pgmap v1280: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:44 compute-2 nova_compute[225701]: 2026-01-23 10:35:44.860 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:45.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:45 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:45 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:45 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:45.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:47 compute-2 ceph-mon[75771]: pgmap v1281: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:47 compute-2 nova_compute[225701]: 2026-01-23 10:35:47.546 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:47.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:47 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:47 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:47 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:47.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:49 compute-2 sudo[248293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:35:49 compute-2 sudo[248293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:49 compute-2 sudo[248293]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:49 compute-2 ceph-mon[75771]: pgmap v1282: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1762354554' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:35:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1762354554' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:35:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:49 compute-2 nova_compute[225701]: 2026-01-23 10:35:49.862 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:49.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:49 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:49 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:49 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:49.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:50 compute-2 ceph-mon[75771]: pgmap v1283: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:35:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:51.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:51 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:51 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:35:51 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:51.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:35:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:52 compute-2 nova_compute[225701]: 2026-01-23 10:35:52.548 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:53 compute-2 ceph-mon[75771]: pgmap v1284: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:53.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:53 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:53 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:53 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:53.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:54 compute-2 ceph-mon[75771]: pgmap v1285: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:54 compute-2 nova_compute[225701]: 2026-01-23 10:35:54.865 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.934602) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164554934873, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1870, "num_deletes": 251, "total_data_size": 4935695, "memory_usage": 5020648, "flush_reason": "Manual Compaction"}
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164554957333, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 3204080, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37622, "largest_seqno": 39487, "table_properties": {"data_size": 3196224, "index_size": 4735, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16131, "raw_average_key_size": 20, "raw_value_size": 3180644, "raw_average_value_size": 3995, "num_data_blocks": 201, "num_entries": 796, "num_filter_entries": 796, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164388, "oldest_key_time": 1769164388, "file_creation_time": 1769164554, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 22743 microseconds, and 8166 cpu microseconds.
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.957419) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 3204080 bytes OK
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.957450) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.959004) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.959024) EVENT_LOG_v1 {"time_micros": 1769164554959020, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.959046) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 4927370, prev total WAL file size 4927370, number of live WAL files 2.
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.960562) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(3128KB)], [72(11MB)]
Jan 23 10:35:54 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164554960706, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15335500, "oldest_snapshot_seqno": -1}
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6839 keys, 13114719 bytes, temperature: kUnknown
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555055191, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13114719, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13071128, "index_size": 25367, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 179703, "raw_average_key_size": 26, "raw_value_size": 12949802, "raw_average_value_size": 1893, "num_data_blocks": 991, "num_entries": 6839, "num_filter_entries": 6839, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164554, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.055557) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13114719 bytes
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.057209) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.1 rd, 138.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 11.6 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(8.9) write-amplify(4.1) OK, records in: 7357, records dropped: 518 output_compression: NoCompression
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.057228) EVENT_LOG_v1 {"time_micros": 1769164555057218, "job": 44, "event": "compaction_finished", "compaction_time_micros": 94610, "compaction_time_cpu_micros": 28685, "output_level": 6, "num_output_files": 1, "total_output_size": 13114719, "num_input_records": 7357, "num_output_records": 6839, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:35:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555057968, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555060238, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.960370) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060280) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:55 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:35:55.507 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:35:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:35:55.507 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:35:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:35:55.507 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:35:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:35:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:55.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:35:55 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:55 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:55 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:55.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:57 compute-2 ceph-mon[75771]: pgmap v1286: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:57 compute-2 nova_compute[225701]: 2026-01-23 10:35:57.593 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:35:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:57.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:35:57 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:57 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:35:57 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:57.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:35:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:58 compute-2 ceph-mon[75771]: pgmap v1287: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:35:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:59 compute-2 nova_compute[225701]: 2026-01-23 10:35:59.914 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:59.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:59 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:35:59 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:59 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:59.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:01 compute-2 ceph-mon[75771]: pgmap v1288: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:01.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:01 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:01 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:01 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:01.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:02 compute-2 nova_compute[225701]: 2026-01-23 10:36:02.595 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:02 compute-2 ceph-mon[75771]: pgmap v1289: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:03 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:03 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:36:03 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:03.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:36:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:03.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:04 compute-2 nova_compute[225701]: 2026-01-23 10:36:04.916 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:05 compute-2 ceph-mon[75771]: pgmap v1290: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:36:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:05.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:06.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:06 compute-2 ceph-mon[75771]: pgmap v1291: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:07 compute-2 nova_compute[225701]: 2026-01-23 10:36:07.599 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:08.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:08.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:09 compute-2 ceph-mon[75771]: pgmap v1292: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:09 compute-2 sudo[248338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:36:09 compute-2 sudo[248338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:09 compute-2 sudo[248338]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:09 compute-2 nova_compute[225701]: 2026-01-23 10:36:09.939 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:10.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:10.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:10 compute-2 nova_compute[225701]: 2026-01-23 10:36:10.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:10 compute-2 nova_compute[225701]: 2026-01-23 10:36:10.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:36:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:11 compute-2 ceph-mon[75771]: pgmap v1293: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:12.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:12.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:12 compute-2 nova_compute[225701]: 2026-01-23 10:36:12.602 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:12 compute-2 ceph-mon[75771]: pgmap v1294: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:12 compute-2 nova_compute[225701]: 2026-01-23 10:36:12.972 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:13 compute-2 podman[248367]: 2026-01-23 10:36:13.662806705 +0000 UTC m=+0.082275536 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 10:36:13 compute-2 podman[248368]: 2026-01-23 10:36:13.663846021 +0000 UTC m=+0.083125977 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 10:36:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:14.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:14.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:14 compute-2 ceph-mon[75771]: pgmap v1295: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:14 compute-2 nova_compute[225701]: 2026-01-23 10:36:14.942 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:16.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:16.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:16 compute-2 nova_compute[225701]: 2026-01-23 10:36:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:17 compute-2 ceph-mon[75771]: pgmap v1296: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:17 compute-2 nova_compute[225701]: 2026-01-23 10:36:17.604 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:17 compute-2 nova_compute[225701]: 2026-01-23 10:36:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:17 compute-2 nova_compute[225701]: 2026-01-23 10:36:17.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:36:17 compute-2 nova_compute[225701]: 2026-01-23 10:36:17.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:36:17 compute-2 nova_compute[225701]: 2026-01-23 10:36:17.929 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:36:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:18.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:18.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:19 compute-2 ceph-mon[75771]: pgmap v1297: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:19 compute-2 nova_compute[225701]: 2026-01-23 10:36:19.944 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:20.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:20.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:36:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:20 compute-2 nova_compute[225701]: 2026-01-23 10:36:20.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:21 compute-2 ceph-mon[75771]: pgmap v1298: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:21 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1725171694' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:22.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:22.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:22 compute-2 nova_compute[225701]: 2026-01-23 10:36:22.606 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:22 compute-2 nova_compute[225701]: 2026-01-23 10:36:22.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:22 compute-2 nova_compute[225701]: 2026-01-23 10:36:22.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:22 compute-2 ceph-mon[75771]: pgmap v1299: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:22 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1801997280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:22 compute-2 nova_compute[225701]: 2026-01-23 10:36:22.918 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:36:22 compute-2 nova_compute[225701]: 2026-01-23 10:36:22.919 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:36:22 compute-2 nova_compute[225701]: 2026-01-23 10:36:22.919 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:36:22 compute-2 nova_compute[225701]: 2026-01-23 10:36:22.919 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:36:22 compute-2 nova_compute[225701]: 2026-01-23 10:36:22.920 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:36:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:23 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:36:23 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4289693492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:23 compute-2 nova_compute[225701]: 2026-01-23 10:36:23.358 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:36:23 compute-2 sudo[248444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:36:23 compute-2 sudo[248444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:23 compute-2 sudo[248444]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:23 compute-2 nova_compute[225701]: 2026-01-23 10:36:23.534 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:36:23 compute-2 nova_compute[225701]: 2026-01-23 10:36:23.536 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4848MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:36:23 compute-2 nova_compute[225701]: 2026-01-23 10:36:23.536 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:36:23 compute-2 nova_compute[225701]: 2026-01-23 10:36:23.536 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:36:23 compute-2 sudo[248469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:36:23 compute-2 sudo[248469]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:23 compute-2 nova_compute[225701]: 2026-01-23 10:36:23.909 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:36:23 compute-2 nova_compute[225701]: 2026-01-23 10:36:23.909 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:36:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:24.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:24.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:24 compute-2 sudo[248469]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:24 compute-2 nova_compute[225701]: 2026-01-23 10:36:24.048 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:36:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4289693492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:36:24 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1368383346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:24 compute-2 nova_compute[225701]: 2026-01-23 10:36:24.477 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:36:24 compute-2 nova_compute[225701]: 2026-01-23 10:36:24.483 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:36:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:24 compute-2 nova_compute[225701]: 2026-01-23 10:36:24.947 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:25 compute-2 nova_compute[225701]: 2026-01-23 10:36:25.222 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:36:25 compute-2 nova_compute[225701]: 2026-01-23 10:36:25.224 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:36:25 compute-2 nova_compute[225701]: 2026-01-23 10:36:25.224 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:36:25 compute-2 ceph-mon[75771]: pgmap v1300: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:25 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1368383346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:25 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1080446153' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:26.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:26.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:26 compute-2 nova_compute[225701]: 2026-01-23 10:36:26.225 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:26 compute-2 nova_compute[225701]: 2026-01-23 10:36:26.225 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:26 compute-2 ceph-mon[75771]: pgmap v1301: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:26 compute-2 nova_compute[225701]: 2026-01-23 10:36:26.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:26 compute-2 nova_compute[225701]: 2026-01-23 10:36:26.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:36:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:27 compute-2 nova_compute[225701]: 2026-01-23 10:36:27.608 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1659114460' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:27 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:27 compute-2 nova_compute[225701]: 2026-01-23 10:36:27.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:28.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:28.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:28 compute-2 ceph-mon[75771]: pgmap v1302: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:36:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:36:28 compute-2 ceph-mon[75771]: pgmap v1303: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:36:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:36:28 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:36:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:29 compute-2 sudo[248553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:36:29 compute-2 sudo[248553]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:29 compute-2 sudo[248553]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:29 compute-2 nova_compute[225701]: 2026-01-23 10:36:29.950 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:30.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:30.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:30 compute-2 ceph-mon[75771]: pgmap v1304: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:32.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:32.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:32 compute-2 nova_compute[225701]: 2026-01-23 10:36:32.612 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:33 compute-2 ceph-mon[75771]: pgmap v1305: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:34.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:34.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:34 compute-2 ceph-mon[75771]: pgmap v1306: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:34 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:34 compute-2 sudo[248584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:36:34 compute-2 sudo[248584]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:34 compute-2 nova_compute[225701]: 2026-01-23 10:36:34.951 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:34 compute-2 sudo[248584]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:36:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:36.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:36.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:36 compute-2 nova_compute[225701]: 2026-01-23 10:36:36.837 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:36 compute-2 nova_compute[225701]: 2026-01-23 10:36:36.837 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:36:36 compute-2 nova_compute[225701]: 2026-01-23 10:36:36.936 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:36:36 compute-2 ceph-mon[75771]: pgmap v1307: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:37 compute-2 nova_compute[225701]: 2026-01-23 10:36:37.614 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:38.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:38.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:39 compute-2 ceph-mon[75771]: pgmap v1308: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:39 compute-2 nova_compute[225701]: 2026-01-23 10:36:39.953 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:40.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:40.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:41 compute-2 ceph-mon[75771]: pgmap v1309: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:42.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:42.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:42 compute-2 nova_compute[225701]: 2026-01-23 10:36:42.617 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:43 compute-2 ceph-mon[75771]: pgmap v1310: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:44.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:44.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:44 compute-2 podman[248621]: 2026-01-23 10:36:44.622909618 +0000 UTC m=+0.051558484 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:36:44 compute-2 podman[248620]: 2026-01-23 10:36:44.656614773 +0000 UTC m=+0.085572516 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 10:36:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:44 compute-2 nova_compute[225701]: 2026-01-23 10:36:44.955 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:45 compute-2 ceph-mon[75771]: pgmap v1311: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:46.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:46.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:47 compute-2 ceph-mon[75771]: pgmap v1312: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:47 compute-2 nova_compute[225701]: 2026-01-23 10:36:47.619 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:48.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:48.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:48 compute-2 ceph-mon[75771]: pgmap v1313: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2195883716' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:36:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/2195883716' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:36:49 compute-2 sudo[248668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:36:49 compute-2 sudo[248668]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:49 compute-2 sudo[248668]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:49 compute-2 nova_compute[225701]: 2026-01-23 10:36:49.955 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:50.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:50.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:50 compute-2 ceph-mon[75771]: pgmap v1314: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:36:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:52.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:52.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:52 compute-2 nova_compute[225701]: 2026-01-23 10:36:52.622 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:52 compute-2 ceph-mon[75771]: pgmap v1315: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:54.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:54.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:54 compute-2 ceph-mon[75771]: pgmap v1316: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:54 compute-2 nova_compute[225701]: 2026-01-23 10:36:54.957 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:36:55.508 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:36:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:36:55.509 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:36:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:36:55.509 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:36:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:56.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:56.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:57 compute-2 ceph-mon[75771]: pgmap v1317: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:57 compute-2 nova_compute[225701]: 2026-01-23 10:36:57.625 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:58.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:36:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:36:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:58.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:36:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:58 compute-2 ceph-mon[75771]: pgmap v1318: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:36:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:59 compute-2 nova_compute[225701]: 2026-01-23 10:36:59.959 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:00.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:37:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:00.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:37:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:00 compute-2 ceph-mon[75771]: pgmap v1319: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:02.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:02.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:02 compute-2 nova_compute[225701]: 2026-01-23 10:37:02.670 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:03 compute-2 ceph-mon[75771]: pgmap v1320: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:04.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:04.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:05 compute-2 nova_compute[225701]: 2026-01-23 10:37:05.005 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:05 compute-2 ceph-mon[75771]: pgmap v1321: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:37:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:06.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:06.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:06 compute-2 ceph-mon[75771]: pgmap v1322: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:07 compute-2 nova_compute[225701]: 2026-01-23 10:37:07.673 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:08.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:08.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:09 compute-2 ceph-mon[75771]: pgmap v1323: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:09 compute-2 sudo[248713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:37:09 compute-2 sudo[248713]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:09 compute-2 sudo[248713]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:10 compute-2 nova_compute[225701]: 2026-01-23 10:37:10.007 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:10.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:10.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:10 compute-2 ceph-mon[75771]: pgmap v1324: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:12.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:12.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:12 compute-2 nova_compute[225701]: 2026-01-23 10:37:12.675 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:13 compute-2 ceph-mon[75771]: pgmap v1325: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:14.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:14.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:14 compute-2 ceph-mon[75771]: pgmap v1326: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:14 compute-2 nova_compute[225701]: 2026-01-23 10:37:14.877 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:15 compute-2 nova_compute[225701]: 2026-01-23 10:37:15.009 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:15 compute-2 podman[248745]: 2026-01-23 10:37:15.622389727 +0000 UTC m=+0.046092589 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 10:37:15 compute-2 podman[248744]: 2026-01-23 10:37:15.660556852 +0000 UTC m=+0.088573491 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 10:37:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:16.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:16.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:16 compute-2 ceph-mon[75771]: pgmap v1327: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:17 compute-2 nova_compute[225701]: 2026-01-23 10:37:17.678 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:17 compute-2 nova_compute[225701]: 2026-01-23 10:37:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:18.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:18.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:19 compute-2 ceph-mon[75771]: pgmap v1328: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:19 compute-2 nova_compute[225701]: 2026-01-23 10:37:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:19 compute-2 nova_compute[225701]: 2026-01-23 10:37:19.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:37:19 compute-2 nova_compute[225701]: 2026-01-23 10:37:19.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:37:19 compute-2 nova_compute[225701]: 2026-01-23 10:37:19.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:37:20 compute-2 nova_compute[225701]: 2026-01-23 10:37:20.011 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:20.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:20.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:37:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:21 compute-2 ceph-mon[75771]: pgmap v1329: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:21 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2129370149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:21 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3810051989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:21 compute-2 nova_compute[225701]: 2026-01-23 10:37:21.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:22.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:22.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:22 compute-2 nova_compute[225701]: 2026-01-23 10:37:22.681 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:22 compute-2 ceph-mon[75771]: pgmap v1330: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:23 compute-2 nova_compute[225701]: 2026-01-23 10:37:23.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:23 compute-2 nova_compute[225701]: 2026-01-23 10:37:23.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:24.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:24.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.400 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.401 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.401 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.401 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.402 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:37:24 compute-2 ceph-mon[75771]: pgmap v1331: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:37:24 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/583159563' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.852 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.993 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.995 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4863MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.995 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:37:24 compute-2 nova_compute[225701]: 2026-01-23 10:37:24.995 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:37:25 compute-2 nova_compute[225701]: 2026-01-23 10:37:25.012 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:25 compute-2 nova_compute[225701]: 2026-01-23 10:37:25.391 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:37:25 compute-2 nova_compute[225701]: 2026-01-23 10:37:25.392 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:37:25 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3870511584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:25 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/583159563' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:25 compute-2 nova_compute[225701]: 2026-01-23 10:37:25.503 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:37:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:37:25 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/574915668' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:25 compute-2 nova_compute[225701]: 2026-01-23 10:37:25.956 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:37:25 compute-2 nova_compute[225701]: 2026-01-23 10:37:25.961 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:37:25 compute-2 nova_compute[225701]: 2026-01-23 10:37:25.979 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:37:25 compute-2 nova_compute[225701]: 2026-01-23 10:37:25.980 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:37:25 compute-2 nova_compute[225701]: 2026-01-23 10:37:25.981 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:37:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:26.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:26.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:26 compute-2 ceph-mon[75771]: pgmap v1332: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/574915668' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3220990230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:27 compute-2 nova_compute[225701]: 2026-01-23 10:37:27.711 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:27 compute-2 nova_compute[225701]: 2026-01-23 10:37:27.981 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:27 compute-2 nova_compute[225701]: 2026-01-23 10:37:27.982 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:27 compute-2 nova_compute[225701]: 2026-01-23 10:37:27.982 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:27 compute-2 nova_compute[225701]: 2026-01-23 10:37:27.982 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:37:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:28.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:28.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:28 compute-2 nova_compute[225701]: 2026-01-23 10:37:28.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:29 compute-2 ceph-mon[75771]: pgmap v1333: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:29 compute-2 sudo[248849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:37:29 compute-2 sudo[248849]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:29 compute-2 sudo[248849]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:30 compute-2 nova_compute[225701]: 2026-01-23 10:37:30.013 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:37:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:30.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:37:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:30.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:30 compute-2 ceph-mon[75771]: pgmap v1334: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:32.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:32.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:32 compute-2 nova_compute[225701]: 2026-01-23 10:37:32.714 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:32 compute-2 ceph-mon[75771]: pgmap v1335: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:34.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:34.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:35 compute-2 nova_compute[225701]: 2026-01-23 10:37:35.016 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:35 compute-2 sudo[248880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:37:35 compute-2 sudo[248880]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:35 compute-2 sudo[248880]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:35 compute-2 sudo[248905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:37:35 compute-2 sudo[248905]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:35 compute-2 ceph-mon[75771]: pgmap v1336: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:37:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:35 compute-2 sudo[248905]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:36.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:36.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:37 compute-2 ceph-mon[75771]: pgmap v1337: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:37 compute-2 nova_compute[225701]: 2026-01-23 10:37:37.779 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:38.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:38.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:40 compute-2 nova_compute[225701]: 2026-01-23 10:37:40.064 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:40 compute-2 ceph-mon[75771]: pgmap v1338: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:40.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:40.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:41 compute-2 ceph-mon[75771]: pgmap v1339: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:41 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:42.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:42.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:42 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:37:42 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:37:42 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:42 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:42 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:37:42 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:37:42 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:37:42 compute-2 nova_compute[225701]: 2026-01-23 10:37:42.782 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:43 compute-2 ceph-mon[75771]: pgmap v1340: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 775 B/s rd, 0 op/s
Jan 23 10:37:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:44.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:44.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:44 compute-2 ceph-mon[75771]: pgmap v1341: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 517 B/s rd, 0 op/s
Jan 23 10:37:45 compute-2 nova_compute[225701]: 2026-01-23 10:37:45.064 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:46.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:46.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:46 compute-2 podman[248976]: 2026-01-23 10:37:46.654773213 +0000 UTC m=+0.071512882 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:37:46 compute-2 podman[248975]: 2026-01-23 10:37:46.686422388 +0000 UTC m=+0.110254331 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 10:37:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:47 compute-2 ceph-mon[75771]: pgmap v1342: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 517 B/s rd, 0 op/s
Jan 23 10:37:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:47 compute-2 nova_compute[225701]: 2026-01-23 10:37:47.784 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:48.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:37:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:48.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:37:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:48 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 23 10:37:48 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:48.971528) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:37:48 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 23 10:37:48 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164668971878, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1356, "num_deletes": 257, "total_data_size": 3394062, "memory_usage": 3463504, "flush_reason": "Manual Compaction"}
Jan 23 10:37:48 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 23 10:37:48 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164668991936, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2198714, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39492, "largest_seqno": 40843, "table_properties": {"data_size": 2192833, "index_size": 3208, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12447, "raw_average_key_size": 19, "raw_value_size": 2180973, "raw_average_value_size": 3450, "num_data_blocks": 137, "num_entries": 632, "num_filter_entries": 632, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164555, "oldest_key_time": 1769164555, "file_creation_time": 1769164668, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:37:48 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 20432 microseconds, and 9787 cpu microseconds.
Jan 23 10:37:48 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:48.992052) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2198714 bytes OK
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:48.992100) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.033821) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.033877) EVENT_LOG_v1 {"time_micros": 1769164669033867, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.033906) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3387697, prev total WAL file size 3387697, number of live WAL files 2.
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.035510) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303130' seq:72057594037927935, type:22 .. '6C6F676D0031323633' seq:0, type:0; will stop at (end)
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2147KB)], [75(12MB)]
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669035679, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15313433, "oldest_snapshot_seqno": -1}
Jan 23 10:37:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6941 keys, 15160705 bytes, temperature: kUnknown
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669255653, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15160705, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15114169, "index_size": 28056, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17413, "raw_key_size": 182779, "raw_average_key_size": 26, "raw_value_size": 14988903, "raw_average_value_size": 2159, "num_data_blocks": 1100, "num_entries": 6941, "num_filter_entries": 6941, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164669, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.256053) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15160705 bytes
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.257422) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 69.6 rd, 68.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 12.5 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(13.9) write-amplify(6.9) OK, records in: 7471, records dropped: 530 output_compression: NoCompression
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.257440) EVENT_LOG_v1 {"time_micros": 1769164669257431, "job": 46, "event": "compaction_finished", "compaction_time_micros": 220140, "compaction_time_cpu_micros": 54695, "output_level": 6, "num_output_files": 1, "total_output_size": 15160705, "num_input_records": 7471, "num_output_records": 6941, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669257990, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669260356, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.034799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260411) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260419) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:49 compute-2 sudo[249021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:37:49 compute-2 sudo[249021]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:49 compute-2 sudo[249021]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:49 compute-2 ceph-mon[75771]: pgmap v1343: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 775 B/s rd, 0 op/s
Jan 23 10:37:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:50 compute-2 nova_compute[225701]: 2026-01-23 10:37:50.109 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:37:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:50.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:37:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:50.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:51 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/980483264' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:37:51 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/980483264' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:37:51 compute-2 ceph-mon[75771]: pgmap v1344: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 517 B/s rd, 0 op/s
Jan 23 10:37:51 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:37:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:52.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:37:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:52.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:37:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:52 compute-2 nova_compute[225701]: 2026-01-23 10:37:52.788 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:54.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:54.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:55 compute-2 nova_compute[225701]: 2026-01-23 10:37:55.112 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:55 compute-2 ceph-mon[75771]: pgmap v1345: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 775 B/s rd, 0 op/s
Jan 23 10:37:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:37:55.509 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:37:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:37:55.510 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:37:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:37:55.510 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:37:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:37:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:56.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:37:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:56.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:56 compute-2 sudo[249052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:37:56 compute-2 sudo[249052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:56 compute-2 sudo[249052]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:56 compute-2 ceph-mon[75771]: pgmap v1346: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:56 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:57 compute-2 nova_compute[225701]: 2026-01-23 10:37:57.791 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:58 compute-2 ceph-mon[75771]: pgmap v1347: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:58.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:37:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:37:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:58.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:37:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:37:59 compute-2 ceph-mon[75771]: pgmap v1348: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:00 compute-2 nova_compute[225701]: 2026-01-23 10:38:00.115 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:38:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:00.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:38:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:00.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:00 compute-2 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:38:00 compute-2 ceph-mon[75771]: pgmap v1349: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:38:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:02.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:38:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:38:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:02.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:38:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:02 compute-2 nova_compute[225701]: 2026-01-23 10:38:02.795 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:03 compute-2 ceph-mon[75771]: pgmap v1350: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:04.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:38:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:04.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:38:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:05 compute-2 nova_compute[225701]: 2026-01-23 10:38:05.116 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:05 compute-2 ceph-mon[75771]: pgmap v1351: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:38:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:38:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:06.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:38:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:38:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:06.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:38:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:07 compute-2 ceph-mon[75771]: pgmap v1352: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:07 compute-2 nova_compute[225701]: 2026-01-23 10:38:07.797 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:08.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:08.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:08 compute-2 ceph-mon[75771]: pgmap v1353: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:09 compute-2 sudo[249092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:38:09 compute-2 sudo[249092]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:09 compute-2 sudo[249092]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:10 compute-2 nova_compute[225701]: 2026-01-23 10:38:10.120 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:38:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:10.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:38:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:10.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:11 compute-2 ceph-mon[75771]: pgmap v1354: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:38:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:12.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:38:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:12.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:12 compute-2 ceph-mon[75771]: pgmap v1355: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:12 compute-2 nova_compute[225701]: 2026-01-23 10:38:12.841 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:38:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:14.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:38:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:14.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:14 compute-2 ceph-mon[75771]: pgmap v1356: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:15 compute-2 nova_compute[225701]: 2026-01-23 10:38:15.122 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:16.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:16.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:16 compute-2 nova_compute[225701]: 2026-01-23 10:38:16.800 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:17 compute-2 ceph-mon[75771]: pgmap v1357: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:17 compute-2 podman[249126]: 2026-01-23 10:38:17.625226995 +0000 UTC m=+0.045677880 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:38:17 compute-2 podman[249125]: 2026-01-23 10:38:17.658591712 +0000 UTC m=+0.085772252 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:38:17 compute-2 nova_compute[225701]: 2026-01-23 10:38:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:17 compute-2 nova_compute[225701]: 2026-01-23 10:38:17.844 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:18.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:18.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:19 compute-2 ceph-mon[75771]: pgmap v1358: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:19 compute-2 nova_compute[225701]: 2026-01-23 10:38:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:19 compute-2 nova_compute[225701]: 2026-01-23 10:38:19.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:38:19 compute-2 nova_compute[225701]: 2026-01-23 10:38:19.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:38:19 compute-2 nova_compute[225701]: 2026-01-23 10:38:19.797 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:38:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:20 compute-2 nova_compute[225701]: 2026-01-23 10:38:20.124 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:20.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:20.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:21 compute-2 ceph-mon[75771]: pgmap v1359: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:38:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:22 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2310328428' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:22 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/387035444' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:38:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:22.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:38:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:22.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:22 compute-2 nova_compute[225701]: 2026-01-23 10:38:22.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:22 compute-2 nova_compute[225701]: 2026-01-23 10:38:22.847 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:23 compute-2 ceph-mon[75771]: pgmap v1360: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:23 compute-2 nova_compute[225701]: 2026-01-23 10:38:23.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:23 compute-2 nova_compute[225701]: 2026-01-23 10:38:23.809 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:38:23 compute-2 nova_compute[225701]: 2026-01-23 10:38:23.809 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:38:23 compute-2 nova_compute[225701]: 2026-01-23 10:38:23.809 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:38:23 compute-2 nova_compute[225701]: 2026-01-23 10:38:23.810 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:38:23 compute-2 nova_compute[225701]: 2026-01-23 10:38:23.810 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:38:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:38:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:24.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:38:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:24.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:38:24 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/15073574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:24 compute-2 nova_compute[225701]: 2026-01-23 10:38:24.281 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:38:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:24 compute-2 nova_compute[225701]: 2026-01-23 10:38:24.426 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:38:24 compute-2 nova_compute[225701]: 2026-01-23 10:38:24.427 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4876MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:38:24 compute-2 nova_compute[225701]: 2026-01-23 10:38:24.428 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:38:24 compute-2 nova_compute[225701]: 2026-01-23 10:38:24.428 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:38:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3620960525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.000 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.000 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.013 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.070 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.071 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.082 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:38:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.106 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.121 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.139 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:25 compute-2 ceph-mon[75771]: pgmap v1361: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:25 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/15073574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:25 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/542188124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:38:25 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3162061129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.580 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.587 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.601 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.604 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:38:25 compute-2 nova_compute[225701]: 2026-01-23 10:38:25.604 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:38:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:26.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:26.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:26 compute-2 ceph-mon[75771]: pgmap v1362: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3162061129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:26 compute-2 nova_compute[225701]: 2026-01-23 10:38:26.605 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:26 compute-2 nova_compute[225701]: 2026-01-23 10:38:26.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:27 compute-2 nova_compute[225701]: 2026-01-23 10:38:27.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:27 compute-2 nova_compute[225701]: 2026-01-23 10:38:27.848 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:28.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:28.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:28 compute-2 nova_compute[225701]: 2026-01-23 10:38:28.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:28 compute-2 nova_compute[225701]: 2026-01-23 10:38:28.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:38:28 compute-2 ceph-mon[75771]: pgmap v1363: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:29 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:30 compute-2 nova_compute[225701]: 2026-01-23 10:38:30.126 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:30.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:30.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:31 compute-2 sudo[249228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:38:31 compute-2 sudo[249228]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:31 compute-2 sudo[249228]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:31 compute-2 ceph-mon[75771]: pgmap v1364: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:38:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:32.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:38:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:32.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:32 compute-2 nova_compute[225701]: 2026-01-23 10:38:32.851 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:33 compute-2 ceph-mon[75771]: pgmap v1365: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:34.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:34.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:34 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:35 compute-2 nova_compute[225701]: 2026-01-23 10:38:35.127 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:35 compute-2 ceph-mon[75771]: pgmap v1366: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:38:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:36.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:38:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:36.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:38:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:37 compute-2 ceph-mon[75771]: pgmap v1367: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:37 compute-2 nova_compute[225701]: 2026-01-23 10:38:37.854 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:38.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:38.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:38 compute-2 ceph-mon[75771]: pgmap v1368: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:39 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:40 compute-2 nova_compute[225701]: 2026-01-23 10:38:40.129 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:40.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:40.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:40 compute-2 ceph-mon[75771]: pgmap v1369: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:42.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:42.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:42 compute-2 nova_compute[225701]: 2026-01-23 10:38:42.856 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:43 compute-2 ceph-mon[75771]: pgmap v1370: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:44.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:44.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:44 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:45 compute-2 ceph-mon[75771]: pgmap v1371: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:45 compute-2 nova_compute[225701]: 2026-01-23 10:38:45.130 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:38:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:46.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:38:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:46.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:47 compute-2 ceph-mon[75771]: pgmap v1372: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:47 compute-2 nova_compute[225701]: 2026-01-23 10:38:47.858 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:48.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:48.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:48 compute-2 podman[249271]: 2026-01-23 10:38:48.62874471 +0000 UTC m=+0.050727203 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:38:48 compute-2 podman[249270]: 2026-01-23 10:38:48.654458759 +0000 UTC m=+0.079901927 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 10:38:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:49 compute-2 ceph-mon[75771]: pgmap v1373: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3411057269' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:38:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/3411057269' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:38:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:49 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:50 compute-2 nova_compute[225701]: 2026-01-23 10:38:50.131 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:50.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:50.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:38:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:51 compute-2 sudo[249318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:38:51 compute-2 sudo[249318]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:51 compute-2 sudo[249318]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:51 compute-2 ceph-mon[75771]: pgmap v1374: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:52.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:52.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:52 compute-2 nova_compute[225701]: 2026-01-23 10:38:52.861 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:53 compute-2 ceph-mon[75771]: pgmap v1375: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:54.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:54.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:54 compute-2 ceph-mon[75771]: pgmap v1376: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:55 compute-2 nova_compute[225701]: 2026-01-23 10:38:55.133 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:38:55.511 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:38:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:38:55.512 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:38:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:38:55.512 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:38:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:56.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:56.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:56 compute-2 sudo[249347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:38:56 compute-2 sudo[249347]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:56 compute-2 sudo[249347]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:56 compute-2 sudo[249372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:38:56 compute-2 sudo[249372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:56 compute-2 ceph-mon[75771]: pgmap v1377: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:57 compute-2 sudo[249372]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:57 compute-2 nova_compute[225701]: 2026-01-23 10:38:57.863 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:58.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:38:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:58.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:58 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:38:58 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:38:58 compute-2 ceph-mon[75771]: pgmap v1378: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 793 B/s rd, 0 op/s
Jan 23 10:38:58 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:38:58 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:38:58 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:38:58 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:38:58 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:38:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:38:59 compute-2 ceph-mon[75771]: pgmap v1379: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 529 B/s rd, 0 op/s
Jan 23 10:38:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:00 compute-2 nova_compute[225701]: 2026-01-23 10:39:00.135 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:00.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:39:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:00.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:39:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:01 compute-2 ceph-mon[75771]: pgmap v1380: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 793 B/s rd, 0 op/s
Jan 23 10:39:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000048s ======
Jan 23 10:39:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:02.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Jan 23 10:39:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:02.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:02 compute-2 nova_compute[225701]: 2026-01-23 10:39:02.866 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:04 compute-2 sudo[249436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:39:04 compute-2 sudo[249436]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:39:04 compute-2 sudo[249436]: pam_unix(sudo:session): session closed for user root
Jan 23 10:39:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:04.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:04 compute-2 ceph-mon[75771]: pgmap v1381: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 529 B/s rd, 0 op/s
Jan 23 10:39:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:39:04 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:39:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:39:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:04.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:39:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:05 compute-2 nova_compute[225701]: 2026-01-23 10:39:05.138 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:05 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:39:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:06.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:06.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:06 compute-2 ceph-mon[75771]: pgmap v1382: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 529 B/s rd, 0 op/s
Jan 23 10:39:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:07 compute-2 nova_compute[225701]: 2026-01-23 10:39:07.868 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:08.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:08.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:08 compute-2 ceph-mon[75771]: pgmap v1383: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 793 B/s rd, 0 op/s
Jan 23 10:39:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:09 compute-2 ceph-mon[75771]: pgmap v1384: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:10 compute-2 nova_compute[225701]: 2026-01-23 10:39:10.177 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:10.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:39:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:10.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:39:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:11 compute-2 sudo[249469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:39:11 compute-2 sudo[249469]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:39:11 compute-2 sudo[249469]: pam_unix(sudo:session): session closed for user root
Jan 23 10:39:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:12 compute-2 ceph-mon[75771]: pgmap v1385: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:12.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:39:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:12.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:39:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:12 compute-2 nova_compute[225701]: 2026-01-23 10:39:12.870 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:13 compute-2 ceph-mon[75771]: pgmap v1386: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:14.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:14.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:14 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:15 compute-2 nova_compute[225701]: 2026-01-23 10:39:15.180 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:15 compute-2 nova_compute[225701]: 2026-01-23 10:39:15.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:16.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:16.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:16 compute-2 ceph-mon[75771]: pgmap v1387: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:17 compute-2 ceph-mon[75771]: pgmap v1388: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:17 compute-2 nova_compute[225701]: 2026-01-23 10:39:17.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:17 compute-2 nova_compute[225701]: 2026-01-23 10:39:17.872 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:18.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:18.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:19 compute-2 podman[249503]: 2026-01-23 10:39:19.63767649 +0000 UTC m=+0.060130154 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 10:39:19 compute-2 podman[249502]: 2026-01-23 10:39:19.699490734 +0000 UTC m=+0.123477105 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 10:39:19 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:20 compute-2 nova_compute[225701]: 2026-01-23 10:39:20.229 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:20.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:20.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:20 compute-2 ceph-mon[75771]: pgmap v1389: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:20 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:39:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:21 compute-2 nova_compute[225701]: 2026-01-23 10:39:21.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:21 compute-2 nova_compute[225701]: 2026-01-23 10:39:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:39:21 compute-2 nova_compute[225701]: 2026-01-23 10:39:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:39:21 compute-2 ceph-mon[75771]: pgmap v1390: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:22.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:39:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:22.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:39:22 compute-2 nova_compute[225701]: 2026-01-23 10:39:22.343 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:39:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:22 compute-2 nova_compute[225701]: 2026-01-23 10:39:22.874 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:23 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1085583944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:23 compute-2 nova_compute[225701]: 2026-01-23 10:39:23.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:23 compute-2 nova_compute[225701]: 2026-01-23 10:39:23.818 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:39:23 compute-2 nova_compute[225701]: 2026-01-23 10:39:23.819 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:39:23 compute-2 nova_compute[225701]: 2026-01-23 10:39:23.819 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:39:23 compute-2 nova_compute[225701]: 2026-01-23 10:39:23.819 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:39:23 compute-2 nova_compute[225701]: 2026-01-23 10:39:23.819 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:39:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:24.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:24.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:39:24 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3838063102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:24 compute-2 nova_compute[225701]: 2026-01-23 10:39:24.347 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:39:24 compute-2 ceph-mon[75771]: pgmap v1391: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:24 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2536873901' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:24 compute-2 nova_compute[225701]: 2026-01-23 10:39:24.497 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:39:24 compute-2 nova_compute[225701]: 2026-01-23 10:39:24.499 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4862MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:39:24 compute-2 nova_compute[225701]: 2026-01-23 10:39:24.499 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:39:24 compute-2 nova_compute[225701]: 2026-01-23 10:39:24.499 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:39:24 compute-2 nova_compute[225701]: 2026-01-23 10:39:24.577 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:39:24 compute-2 nova_compute[225701]: 2026-01-23 10:39:24.578 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:39:24 compute-2 nova_compute[225701]: 2026-01-23 10:39:24.600 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:39:24 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:25 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:39:25 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/712866539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:25 compute-2 nova_compute[225701]: 2026-01-23 10:39:25.046 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:39:25 compute-2 nova_compute[225701]: 2026-01-23 10:39:25.052 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:39:25 compute-2 nova_compute[225701]: 2026-01-23 10:39:25.067 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:39:25 compute-2 nova_compute[225701]: 2026-01-23 10:39:25.068 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:39:25 compute-2 nova_compute[225701]: 2026-01-23 10:39:25.068 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:39:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:25 compute-2 nova_compute[225701]: 2026-01-23 10:39:25.230 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:26 compute-2 nova_compute[225701]: 2026-01-23 10:39:26.069 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:26.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:26.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1040346182' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3838063102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/712866539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3515020931' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:26 compute-2 nova_compute[225701]: 2026-01-23 10:39:26.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:27 compute-2 nova_compute[225701]: 2026-01-23 10:39:27.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:27 compute-2 nova_compute[225701]: 2026-01-23 10:39:27.785 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:27 compute-2 nova_compute[225701]: 2026-01-23 10:39:27.876 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:39:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:28.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:39:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:28.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:28 compute-2 nova_compute[225701]: 2026-01-23 10:39:28.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:28 compute-2 ceph-mon[75771]: pgmap v1392: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:28 compute-2 ceph-mon[75771]: pgmap v1393: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:30 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:30 compute-2 nova_compute[225701]: 2026-01-23 10:39:30.232 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:30.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:30.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:30 compute-2 ceph-mon[75771]: pgmap v1394: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:30 compute-2 nova_compute[225701]: 2026-01-23 10:39:30.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:30 compute-2 nova_compute[225701]: 2026-01-23 10:39:30.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:39:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:31 compute-2 sudo[249605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:39:31 compute-2 sudo[249605]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:39:31 compute-2 sudo[249605]: pam_unix(sudo:session): session closed for user root
Jan 23 10:39:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:32.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:39:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:32.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:39:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:32 compute-2 ceph-mon[75771]: pgmap v1395: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:32 compute-2 nova_compute[225701]: 2026-01-23 10:39:32.879 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:34 compute-2 ceph-mon[75771]: pgmap v1396: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:34.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:34.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:35 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:35 compute-2 nova_compute[225701]: 2026-01-23 10:39:35.235 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:35 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:39:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:36.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:36.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:37 compute-2 ceph-mon[75771]: pgmap v1397: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:37 compute-2 nova_compute[225701]: 2026-01-23 10:39:37.881 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.029794) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778029856, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1276, "num_deletes": 251, "total_data_size": 3247013, "memory_usage": 3285208, "flush_reason": "Manual Compaction"}
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778070254, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 2102623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40848, "largest_seqno": 42119, "table_properties": {"data_size": 2096978, "index_size": 3039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11955, "raw_average_key_size": 19, "raw_value_size": 2085740, "raw_average_value_size": 3482, "num_data_blocks": 131, "num_entries": 599, "num_filter_entries": 599, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164669, "oldest_key_time": 1769164669, "file_creation_time": 1769164778, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 40516 microseconds, and 6160 cpu microseconds.
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.070314) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 2102623 bytes OK
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.070334) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.073915) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.073967) EVENT_LOG_v1 {"time_micros": 1769164778073957, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.073992) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3240998, prev total WAL file size 3240998, number of live WAL files 2.
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.075091) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(2053KB)], [78(14MB)]
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778075228, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 17263328, "oldest_snapshot_seqno": -1}
Jan 23 10:39:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 7024 keys, 14949350 bytes, temperature: kUnknown
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778202032, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 14949350, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14902767, "index_size": 27911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17605, "raw_key_size": 185222, "raw_average_key_size": 26, "raw_value_size": 14776455, "raw_average_value_size": 2103, "num_data_blocks": 1086, "num_entries": 7024, "num_filter_entries": 7024, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164778, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.202277) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 14949350 bytes
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.207948) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.1 rd, 117.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 14.5 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(15.3) write-amplify(7.1) OK, records in: 7540, records dropped: 516 output_compression: NoCompression
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.207976) EVENT_LOG_v1 {"time_micros": 1769164778207961, "job": 48, "event": "compaction_finished", "compaction_time_micros": 126872, "compaction_time_cpu_micros": 34669, "output_level": 6, "num_output_files": 1, "total_output_size": 14949350, "num_input_records": 7540, "num_output_records": 7024, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778208446, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778211129, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.075011) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211259) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211273) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:38 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211279) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:38.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:38.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:38 compute-2 ceph-mon[75771]: pgmap v1398: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:39:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:40.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:39:40 compute-2 nova_compute[225701]: 2026-01-23 10:39:40.268 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:40.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:40 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:40 compute-2 ceph-mon[75771]: pgmap v1399: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:42.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:42.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:42 compute-2 ceph-mon[75771]: pgmap v1400: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:42 compute-2 nova_compute[225701]: 2026-01-23 10:39:42.884 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:44.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:44.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:44 compute-2 ceph-mon[75771]: pgmap v1401: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:45 compute-2 nova_compute[225701]: 2026-01-23 10:39:45.271 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:45 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:46.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000047s ======
Jan 23 10:39:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:46.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000047s
Jan 23 10:39:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:47 compute-2 ceph-mon[75771]: pgmap v1402: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:47 compute-2 nova_compute[225701]: 2026-01-23 10:39:47.886 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:48.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:48.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:48 compute-2 ceph-mon[75771]: pgmap v1403: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/786440771' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:39:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/786440771' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:39:49 compute-2 ceph-mon[75771]: pgmap v1404: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:50 compute-2 nova_compute[225701]: 2026-01-23 10:39:50.272 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:50.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:50.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:50 compute-2 podman[249650]: 2026-01-23 10:39:50.631748514 +0000 UTC m=+0.056391172 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 10:39:50 compute-2 podman[249649]: 2026-01-23 10:39:50.655494825 +0000 UTC m=+0.080734767 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 23 10:39:50 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:51 compute-2 sudo[249691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:39:51 compute-2 sudo[249691]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:39:51 compute-2 sudo[249691]: pam_unix(sudo:session): session closed for user root
Jan 23 10:39:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:39:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:52.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:39:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:39:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:52.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:39:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:52 compute-2 nova_compute[225701]: 2026-01-23 10:39:52.888 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:53 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:39:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:54.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:54 compute-2 ceph-mon[75771]: pgmap v1405: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:54 compute-2 ceph-mon[75771]: pgmap v1406: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:54.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:55 compute-2 nova_compute[225701]: 2026-01-23 10:39:55.273 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:39:55.511 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:39:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:39:55.512 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:39:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:39:55.512 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:39:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:55 compute-2 ceph-mon[75771]: pgmap v1407: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:56.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:56.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:57 compute-2 ceph-mon[75771]: pgmap v1408: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:57 compute-2 nova_compute[225701]: 2026-01-23 10:39:57.890 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:39:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:58.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:39:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:39:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:58.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:39:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:00 compute-2 nova_compute[225701]: 2026-01-23 10:40:00.275 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:00.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:00.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:00 compute-2 ceph-mon[75771]: pgmap v1409: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:00 compute-2 ceph-mon[75771]: overall HEALTH_WARN 2 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Jan 23 10:40:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:02 compute-2 ceph-mon[75771]: pgmap v1410: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:02.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:02.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:02 compute-2 nova_compute[225701]: 2026-01-23 10:40:02.893 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:04 compute-2 ceph-mon[75771]: pgmap v1411: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:40:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:04.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:40:04 compute-2 sudo[249728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:40:04 compute-2 sudo[249728]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:04 compute-2 sudo[249728]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:04.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:04 compute-2 sudo[249753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:40:04 compute-2 sudo[249753]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:04 compute-2 sudo[249753]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:05 compute-2 nova_compute[225701]: 2026-01-23 10:40:05.277 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:40:06 compute-2 ceph-mon[75771]: pgmap v1412: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:40:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:06.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:06.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:07 compute-2 ceph-mon[75771]: pgmap v1413: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 592 B/s rd, 0 op/s
Jan 23 10:40:07 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:07 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:40:07 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:40:07 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:07 compute-2 nova_compute[225701]: 2026-01-23 10:40:07.895 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:08.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:08.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:08 compute-2 ceph-mon[75771]: pgmap v1414: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:10 compute-2 nova_compute[225701]: 2026-01-23 10:40:10.279 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:10.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:10.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:10 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:10 compute-2 sudo[249816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:40:10 compute-2 sudo[249816]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:10 compute-2 sudo[249816]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:11 compute-2 ceph-mon[75771]: pgmap v1415: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:11 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:11 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:11 compute-2 sudo[249842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:40:11 compute-2 sudo[249842]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:11 compute-2 sudo[249842]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:12.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:12.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:12 compute-2 nova_compute[225701]: 2026-01-23 10:40:12.898 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:13 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:13 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:13 compute-2 ceph-mon[75771]: pgmap v1416: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:14.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:14 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:14 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:14 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:14.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:14 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:14 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:14 compute-2 ceph-mon[75771]: pgmap v1417: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:15 compute-2 nova_compute[225701]: 2026-01-23 10:40:15.281 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:15 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:15 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:15 compute-2 nova_compute[225701]: 2026-01-23 10:40:15.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:15 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:16.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:16 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:16 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:16 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:16.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:16 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:16 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:17 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:17 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:17 compute-2 ceph-mon[75771]: pgmap v1418: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:17 compute-2 nova_compute[225701]: 2026-01-23 10:40:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:17 compute-2 nova_compute[225701]: 2026-01-23 10:40:17.901 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:18.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:18 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:18 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:18 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:18 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:40:18 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:18.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:40:18 compute-2 ceph-mon[75771]: pgmap v1419: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:19 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:19 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:20 compute-2 nova_compute[225701]: 2026-01-23 10:40:20.283 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:20.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:20 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:20 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:20 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:20 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:20 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:20.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:21 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:21 compute-2 ceph-mon[75771]: pgmap v1420: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:21 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:40:21 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:21 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:21 compute-2 podman[249878]: 2026-01-23 10:40:21.633500726 +0000 UTC m=+0.054216359 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 10:40:21 compute-2 podman[249877]: 2026-01-23 10:40:21.71755971 +0000 UTC m=+0.139012601 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 10:40:21 compute-2 nova_compute[225701]: 2026-01-23 10:40:21.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:21 compute-2 nova_compute[225701]: 2026-01-23 10:40:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:40:21 compute-2 nova_compute[225701]: 2026-01-23 10:40:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:40:21 compute-2 nova_compute[225701]: 2026-01-23 10:40:21.799 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:40:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:22.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:22 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:22 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:22 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:22 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:22 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:22.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:22 compute-2 nova_compute[225701]: 2026-01-23 10:40:22.904 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:23 compute-2 ceph-mon[75771]: pgmap v1421: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:23 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:23 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:24.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:24 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:24 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:24 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:24 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:24 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:24.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.162027) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825162289, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 709, "num_deletes": 251, "total_data_size": 1512212, "memory_usage": 1528376, "flush_reason": "Manual Compaction"}
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825171485, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 689452, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42124, "largest_seqno": 42828, "table_properties": {"data_size": 686359, "index_size": 1001, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8261, "raw_average_key_size": 20, "raw_value_size": 679885, "raw_average_value_size": 1712, "num_data_blocks": 43, "num_entries": 397, "num_filter_entries": 397, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164778, "oldest_key_time": 1769164778, "file_creation_time": 1769164825, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 9446 microseconds, and 3964 cpu microseconds.
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.171548) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 689452 bytes OK
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.171579) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.174743) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.174763) EVENT_LOG_v1 {"time_micros": 1769164825174759, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.174778) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 1508402, prev total WAL file size 1508402, number of live WAL files 2.
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.175543) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323535' seq:72057594037927935, type:22 .. '6D6772737461740031353037' seq:0, type:0; will stop at (end)
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(673KB)], [81(14MB)]
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825175713, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 15638802, "oldest_snapshot_seqno": -1}
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6922 keys, 11743274 bytes, temperature: kUnknown
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825257855, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 11743274, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11701864, "index_size": 22994, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 183305, "raw_average_key_size": 26, "raw_value_size": 11581679, "raw_average_value_size": 1673, "num_data_blocks": 892, "num_entries": 6922, "num_filter_entries": 6922, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164825, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.258330) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 11743274 bytes
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.260057) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 190.0 rd, 142.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 14.3 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(39.7) write-amplify(17.0) OK, records in: 7421, records dropped: 499 output_compression: NoCompression
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.260078) EVENT_LOG_v1 {"time_micros": 1769164825260069, "job": 50, "event": "compaction_finished", "compaction_time_micros": 82296, "compaction_time_cpu_micros": 36610, "output_level": 6, "num_output_files": 1, "total_output_size": 11743274, "num_input_records": 7421, "num_output_records": 6922, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825260627, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825263747, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.175365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263818) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263826) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-2 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263827) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-2 nova_compute[225701]: 2026-01-23 10:40:25.285 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:25 compute-2 ceph-mon[75771]: pgmap v1422: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:25 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1646194779' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:25 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:25 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:25 compute-2 nova_compute[225701]: 2026-01-23 10:40:25.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:25 compute-2 nova_compute[225701]: 2026-01-23 10:40:25.806 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:40:25 compute-2 nova_compute[225701]: 2026-01-23 10:40:25.806 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:40:25 compute-2 nova_compute[225701]: 2026-01-23 10:40:25.806 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:40:25 compute-2 nova_compute[225701]: 2026-01-23 10:40:25.806 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:40:25 compute-2 nova_compute[225701]: 2026-01-23 10:40:25.807 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:40:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:26 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:40:26 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1373761955' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:26 compute-2 nova_compute[225701]: 2026-01-23 10:40:26.277 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:40:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:26.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:26 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:26 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:26 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:26 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:26 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:26.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:26 compute-2 nova_compute[225701]: 2026-01-23 10:40:26.419 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:40:26 compute-2 nova_compute[225701]: 2026-01-23 10:40:26.420 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4834MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:40:26 compute-2 nova_compute[225701]: 2026-01-23 10:40:26.421 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:40:26 compute-2 nova_compute[225701]: 2026-01-23 10:40:26.421 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:40:26 compute-2 nova_compute[225701]: 2026-01-23 10:40:26.486 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:40:26 compute-2 nova_compute[225701]: 2026-01-23 10:40:26.487 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:40:26 compute-2 nova_compute[225701]: 2026-01-23 10:40:26.503 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:40:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4269113456' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1984292967' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:26 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1373761955' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:27 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:40:27 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1342687352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:27 compute-2 nova_compute[225701]: 2026-01-23 10:40:27.182 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.679s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:40:27 compute-2 nova_compute[225701]: 2026-01-23 10:40:27.187 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:40:27 compute-2 nova_compute[225701]: 2026-01-23 10:40:27.205 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:40:27 compute-2 nova_compute[225701]: 2026-01-23 10:40:27.206 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:40:27 compute-2 nova_compute[225701]: 2026-01-23 10:40:27.206 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:40:27 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:27 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:27 compute-2 ceph-mon[75771]: pgmap v1423: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2247912976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:27 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1342687352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:27 compute-2 nova_compute[225701]: 2026-01-23 10:40:27.907 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:28 compute-2 nova_compute[225701]: 2026-01-23 10:40:28.207 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:28 compute-2 nova_compute[225701]: 2026-01-23 10:40:28.208 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:28 compute-2 nova_compute[225701]: 2026-01-23 10:40:28.208 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:28 compute-2 nova_compute[225701]: 2026-01-23 10:40:28.208 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:28.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:28 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:28 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:28 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:28 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:40:28 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:28.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:40:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:29 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:29 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:29 compute-2 ceph-mon[75771]: pgmap v1424: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:30 compute-2 nova_compute[225701]: 2026-01-23 10:40:30.287 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:30.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:30 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:30 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:30 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:30 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:30 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:30.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:31 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:31 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:31 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:31 compute-2 ceph-mon[75771]: pgmap v1425: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:31 compute-2 sudo[249975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:40:31 compute-2 sudo[249975]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:31 compute-2 sudo[249975]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:31 compute-2 nova_compute[225701]: 2026-01-23 10:40:31.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:31 compute-2 nova_compute[225701]: 2026-01-23 10:40:31.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:40:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:32.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:32 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:32 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:32 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:32 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:32 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:32.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:32 compute-2 nova_compute[225701]: 2026-01-23 10:40:32.909 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:33 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:33 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:34.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:34 compute-2 ceph-mon[75771]: pgmap v1426: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:34 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:34 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:34 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:34 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:34 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:34.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:35 compute-2 ceph-mon[75771]: pgmap v1427: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:35 compute-2 nova_compute[225701]: 2026-01-23 10:40:35.289 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:35 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:35 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:36 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:40:36 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:40:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:36.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:40:36 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:36 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:36 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:36 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:40:36 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:36.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:40:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:37 compute-2 ceph-mon[75771]: pgmap v1428: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:37 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:37 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:37 compute-2 nova_compute[225701]: 2026-01-23 10:40:37.912 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:38.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:38 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:38 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:38 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:38 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:38 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:38.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:39 compute-2 ceph-mon[75771]: pgmap v1429: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:39 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:39 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:40:40 compute-2 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3000.0 total, 600.0 interval
                                           Cumulative writes: 8117 writes, 42K keys, 8117 commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s
                                           Cumulative WAL: 8117 writes, 8117 syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1522 writes, 8008 keys, 1522 commit groups, 1.0 writes per commit group, ingest: 17.39 MB, 0.03 MB/s
                                           Interval WAL: 1523 writes, 1523 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     26.7      2.26              0.24        25    0.091       0      0       0.0       0.0
                                             L6      1/0   11.20 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   5.0     93.8     80.6      3.75              1.13        24    0.156    146K    13K       0.0       0.0
                                            Sum      1/0   11.20 MB   0.0      0.3     0.1      0.3       0.4      0.1       0.0   6.0     58.5     60.4      6.02              1.37        49    0.123    146K    13K       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.7    108.3    105.5      0.96              0.29        14    0.069     51K   4033       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0     93.8     80.6      3.75              1.13        24    0.156    146K    13K       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     26.8      2.26              0.24        24    0.094       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 3000.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.059, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.35 GB write, 0.12 MB/s write, 0.34 GB read, 0.12 MB/s read, 6.0 seconds
                                           Interval compaction: 0.10 GB write, 0.17 MB/s write, 0.10 GB read, 0.17 MB/s read, 1.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 31.80 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000212 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1897,30.72 MB,10.1039%) FilterBlock(49,427.73 KB,0.137404%) IndexBlock(49,682.17 KB,0.219139%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:40:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:40 compute-2 nova_compute[225701]: 2026-01-23 10:40:40.292 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:40.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:40 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:40 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:40 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:40 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:40 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:40.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:41 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:41 compute-2 ceph-mon[75771]: pgmap v1430: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:41 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:41 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:42.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:42 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:42 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:42 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:42 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:42 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:42.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:42 compute-2 nova_compute[225701]: 2026-01-23 10:40:42.915 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:43 compute-2 ceph-mon[75771]: pgmap v1431: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:43 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:43 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:43 compute-2 sshd-session[250012]: Accepted publickey for zuul from 192.168.122.10 port 47016 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:40:43 compute-2 systemd-logind[786]: New session 58 of user zuul.
Jan 23 10:40:43 compute-2 systemd[1]: Started Session 58 of User zuul.
Jan 23 10:40:43 compute-2 sshd-session[250012]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:40:43 compute-2 sudo[250016]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 23 10:40:43 compute-2 sudo[250016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:40:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:44.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:44 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:44 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:44 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:44 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:44 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:44.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:45 compute-2 ceph-mon[75771]: pgmap v1432: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:45 compute-2 nova_compute[225701]: 2026-01-23 10:40:45.295 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:45 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:45 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:46 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:46.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:46 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:46 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:46 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:46 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:46 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:46.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:47 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 10:40:47 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2902829741' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:40:47 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:47 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:47 compute-2 ceph-mon[75771]: pgmap v1433: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:47 compute-2 ceph-mon[75771]: from='client.27121 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:47 compute-2 ceph-mon[75771]: from='client.27002 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:47 compute-2 ceph-mon[75771]: from='client.17607 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:47 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2902829741' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:40:47 compute-2 nova_compute[225701]: 2026-01-23 10:40:47.918 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:48.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:48 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:48 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:48 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:48 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:48 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:48.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:49 compute-2 ceph-mon[75771]: from='client.27130 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:49 compute-2 ceph-mon[75771]: from='client.17613 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:49 compute-2 ceph-mon[75771]: from='client.27008 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/390486485' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:40:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1900352961' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:40:49 compute-2 ceph-mon[75771]: pgmap v1434: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:49 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1287907540' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:40:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:49 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:49 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:50 compute-2 nova_compute[225701]: 2026-01-23 10:40:50.295 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:50.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:50 compute-2 ovs-vsctl[250348]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 10:40:50 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:50 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:50 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:50 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:50 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:50.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:50 compute-2 ceph-mon[75771]: from='client.? 192.168.122.10:0/1287907540' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:40:50 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:40:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:51 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:51 compute-2 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 10:40:51 compute-2 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 10:40:51 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:51 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:51 compute-2 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 10:40:51 compute-2 ceph-mon[75771]: pgmap v1435: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:51 compute-2 sudo[250563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:40:51 compute-2 sudo[250563]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:51 compute-2 sudo[250563]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:51 compute-2 podman[250604]: 2026-01-23 10:40:51.781169538 +0000 UTC m=+0.060471851 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 10:40:51 compute-2 podman[250620]: 2026-01-23 10:40:51.83064516 +0000 UTC m=+0.080371924 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:40:51 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: cache status {prefix=cache status} (starting...)
Jan 23 10:40:51 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: client ls {prefix=client ls} (starting...)
Jan 23 10:40:52 compute-2 lvm[250768]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 10:40:52 compute-2 lvm[250768]: VG ceph_vg0 finished
Jan 23 10:40:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:40:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:52.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:40:52 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:52 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:52 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:52 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:52 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:52.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:52 compute-2 ceph-mon[75771]: pgmap v1436: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:52 compute-2 ceph-mon[75771]: from='client.27163 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:52 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 10:40:52 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 10:40:52 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 10:40:52 compute-2 nova_compute[225701]: 2026-01-23 10:40:52.920 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:52 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 23 10:40:52 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2735878044' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:53 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 10:40:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:53 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 10:40:53 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 10:40:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 23 10:40:53 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2326781007' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:53 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:53 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:53 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 10:40:53 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 10:40:53 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 23 10:40:53 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4283374481' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:40:53 compute-2 ceph-mon[75771]: from='client.27175 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:53 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2735878044' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:53 compute-2 ceph-mon[75771]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:53 compute-2 ceph-mon[75771]: from='client.17643 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:53 compute-2 ceph-mon[75771]: from='client.27184 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:53 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2326781007' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:53 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2803210379' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: ops {prefix=ops} (starting...)
Jan 23 10:40:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 23 10:40:54 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1170917103' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:40:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:54.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:54 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:54 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:54 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:54 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:54 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:54.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 23 10:40:54 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1278040154' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 10:40:54 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1520575926' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: session ls {prefix=session ls} (starting...)
Jan 23 10:40:54 compute-2 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: status {prefix=status} (starting...)
Jan 23 10:40:54 compute-2 ceph-mon[75771]: pgmap v1437: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.17655 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.27032 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.27199 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4283374481' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2749828607' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.17667 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1569340402' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.27044 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1170917103' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2987158526' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.17679 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1278040154' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2991488001' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:54 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1520575926' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 10:40:55 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/431388215' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:55 compute-2 nova_compute[225701]: 2026-01-23 10:40:55.321 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:55 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:55 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:40:55.513 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:40:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:40:55.513 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:40:55 compute-2 ovn_metadata_agent[142601]: 2026-01-23 10:40:55.513 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:40:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 10:40:55 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2891389420' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 23 10:40:55 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2152170927' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.27056 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.27244 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1579084197' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1948725629' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2517041095' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.27068 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/431388215' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.27256 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.17706 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2275578091' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2494977871' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2891389420' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/758304118' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2152170927' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:55 compute-2 ceph-mon[75771]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:56 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 10:40:56 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/858187861' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:40:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:56 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:56 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 23 10:40:56 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4002027261' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:40:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:56.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:56 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:56 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:56 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:56 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:56 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:56.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:56 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 23 10:40:56 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/676404484' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 10:40:57 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3056118106' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 23 10:40:57 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/529095491' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: pgmap v1438: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.17721 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4141150626' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.27095 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/858187861' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3865666164' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2474866121' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4002027261' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.27101 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1120773140' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/676404484' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2664983427' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1456795587' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/883122861' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3387213635' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:40:57 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:57 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:57 compute-2 nova_compute[225701]: 2026-01-23 10:40:57.922 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:57 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 23 10:40:57 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3096053969' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:16.339328+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:17.339467+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:18.339788+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:19.339954+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:20.340127+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:21.340363+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:22.340536+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:23.340684+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:24.340819+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:25.340934+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:26.341137+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:27.341320+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:28.341454+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:29.341691+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:30.341873+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:31.342128+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:32.342346+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:33.342489+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:34.342655+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:35.342818+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:36.343046+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:37.343200+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:38.343380+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ec800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 63.238079071s of 63.242374420s, submitted: 1
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:39.343524+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:40.343659+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:41.343859+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837612 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:42.344035+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:43.344230+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:44.344391+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:45.344601+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:46.344870+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:47.345007+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:48.345210+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:49.345450+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:50.345621+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:51.345780+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:52.345935+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:53.346121+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:54.346270+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:55.346430+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:56.346627+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:57.346788+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:58.346941+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:59.347061+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:00.347209+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:01.347333+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:02.347516+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:03.347764+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:04.347900+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:05.348048+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:06.348232+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:07.348433+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:08.348572+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:09.348807+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:10.349000+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:11.349141+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:12.349298+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:13.349521+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:14.349663+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:15.349832+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:16.350076+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:17.350286+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:18.350465+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:19.350640+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:20.350849+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:21.351055+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:22.351205+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:23.351386+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:24.351641+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:25.351853+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:26.352044+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:27.352187+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:28.352527+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:29.352793+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:30.352946+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:31.353122+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:32.353300+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:33.353474+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:34.353609+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:35.353767+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:36.354039+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:37.354184+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:38.354329+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:39.354814+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:40.354989+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:41.355123+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:42.355308+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:43.355443+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:44.355596+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:45.355743+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:46.355949+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:47.356117+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:48.356430+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:49.356559+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:50.356707+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:51.356894+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:52.357120+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:53.357313+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:54.357465+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:55.357593+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:56.357817+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:57.357981+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:58.358247+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:59.358379+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:00.358614+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:01.358788+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:02.359022+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:03.359252+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:04.359463+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:05.359719+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:06.360081+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:07.360340+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:08.360546+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:09.360785+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:10.360926+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261ec800 session 0x559226fee960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:11.361133+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:12.361357+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:13.361548+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:14.361756+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:15.361913+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:16.362086+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:17.362242+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:18.362384+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:19.362582+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:20.362747+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:21.362875+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:22.363052+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:23.363255+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:24.363419+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:25.363591+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:26.363853+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:27.364031+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:28.364379+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:29.364578+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:30.364811+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 110.972518921s of 111.781974792s, submitted: 2
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:31.364967+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:32.365113+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:33.365300+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:34.365472+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:35.365634+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:36.365785+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:37.365990+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:38.366159+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:39.366280+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:40.366404+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:41.366522+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:42.366669+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:43.366767+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:44.367174+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:45.367310+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:46.367528+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:47.367678+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:48.367867+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:49.368014+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:50.368174+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:51.368301+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:52.368457+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:53.368851+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:54.369037+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:55.369268+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:56.369496+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:57.369797+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:58.369986+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:59.370142+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:00.370306+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:01.370587+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:02.370807+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:03.370964+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:04.371112+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:05.371255+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:06.371526+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:07.371699+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:08.371888+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:09.372053+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:10.372206+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:11.372411+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:12.372567+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:13.372696+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:14.372875+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:15.373021+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:16.373194+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:17.373337+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:18.373535+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:19.373858+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:20.374075+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:21.374236+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:22.374390+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:23.374539+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:24.374777+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:25.374911+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:26.375062+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:27.375205+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:28.375351+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:29.375498+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:30.375650+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:31.375785+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 5807 writes, 24K keys, 5807 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5807 writes, 987 syncs, 5.88 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 440 writes, 717 keys, 440 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
                                           Interval WAL: 440 writes, 204 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb09b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:32.375984+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:33.376157+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:34.376349+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:35.376598+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:36.376860+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:37.377032+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:38.377277+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:39.377524+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:40.377792+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:41.377975+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:42.378156+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:43.378348+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:44.378501+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:45.378646+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:46.378882+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:47.379104+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:48.379332+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:49.379527+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:50.379697+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:51.379929+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:52.380088+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:53.380260+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:54.380408+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261f1400 session 0x559226fefc20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:55.380538+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:56.380788+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:57.380990+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:58.381168+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:59.381396+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:00.381583+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:01.381805+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:02.382018+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:03.382172+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:04.382319+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 93.923355103s of 93.927070618s, submitted: 1
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:05.382442+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74350592 unmapped: 1105920 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:06.382696+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:07.382871+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:08.383061+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:09.383248+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b04c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:10.383512+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1097728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:11.383663+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1089536 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:12.383837+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 1073152 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838014 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:13.384047+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75448320 unmapped: 1056768 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226636c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:14.384220+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 1048576 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:15.384411+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 1048576 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.652290344s of 11.764292717s, submitted: 230
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:16.384590+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:17.384994+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840966 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:18.385259+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:19.385485+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:20.385649+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:21.385798+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:22.385942+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:23.386086+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:24.386326+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:25.386558+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:26.386904+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:27.387224+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:28.387541+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:29.387896+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:30.388181+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:31.388521+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:32.388798+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:33.388982+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:34.389162+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:35.389299+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:36.389488+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:37.389632+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:38.389797+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:39.389984+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:40.390219+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:41.390383+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:42.390515+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226afa000 session 0x5592254723c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:43.390686+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:44.390863+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:45.390990+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:46.391191+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:47.391387+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:48.391603+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:49.391785+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:50.391907+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:51.392099+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:52.392315+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:53.392508+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:54.392694+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:55.392773+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:56.393040+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:57.393212+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.259738922s of 41.266269684s, submitted: 3
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 841887 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:58.393392+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:59.393599+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:00.393799+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226728800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:01.393995+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:02.394211+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 841887 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:03.394384+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:04.394525+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:05.394648+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:06.394978+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:07.395208+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:08.395379+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:09.395582+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:10.395780+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:11.395953+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:12.396121+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:13.396297+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:14.396492+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:15.396609+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:16.396772+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:17.396930+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:18.397070+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:19.397190+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:20.397319+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:21.397471+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:22.397627+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:23.397789+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:24.397927+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:25.398053+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:26.398222+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:27.398389+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:28.398596+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:29.398826+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:30.399670+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:31.399869+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:32.400038+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:33.400203+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:34.400400+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:35.400538+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:36.400703+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:37.400917+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:38.401059+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:39.401270+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:40.401416+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:41.402515+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:42.403469+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:43.404163+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:44.405659+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:45.405855+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:46.406203+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:47.406661+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:48.407282+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:49.407510+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:50.407699+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:51.407884+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:52.408231+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:53.408676+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:54.408841+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:55.409013+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:56.409293+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:57.409529+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:58.409949+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:59.410211+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:00.410388+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:01.410654+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:02.410879+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:03.411228+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:04.411572+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:05.411791+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:06.412084+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:07.412259+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:08.412506+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:09.412775+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:10.412892+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:11.413049+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:12.413273+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:13.413476+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:14.413712+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:15.414012+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:16.414273+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:17.414435+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:18.414625+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:19.414837+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:20.414976+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:21.415179+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:22.415378+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:23.415586+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:24.415800+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:25.416036+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:26.416416+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:27.416596+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:28.416778+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:29.417101+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:30.417339+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:31.417559+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:32.417710+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:33.417859+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:34.418069+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:35.418234+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:36.418463+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:37.418613+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:38.418801+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:39.418959+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:40.419192+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:41.419440+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:42.419621+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:43.419826+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:44.419994+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:45.420148+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:46.420319+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:47.420449+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:48.420594+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:49.420795+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:50.421018+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1974272 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:51.421186+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:52.421355+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:53.421529+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:54.421674+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:55.421858+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:56.422124+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:57.422303+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:58.422449+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:59.422613+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:00.422789+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:01.422954+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:02.423121+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:03.423295+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:04.423485+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:05.423643+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:06.423804+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:07.423974+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af8800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 130.189498901s of 130.569747925s, submitted: 3
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _renew_subs
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75636736 unmapped: 1916928 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:08.424117+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846191 data_alloc: 218103808 data_used: 40960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _renew_subs
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fca7a000/0x0/0x4ffc00000, data 0xed7f2/0x1a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 76718080 unmapped: 835584 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:09.424270+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 16392192 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:10.424449+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _renew_subs
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 141 ms_handle_reset con 0x559226af8800 session 0x559227226780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fba75000/0x0/0x4ffc00000, data 0x10ef970/0x11a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 16359424 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eeb000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fba75000/0x0/0x4ffc00000, data 0x10ef970/0x11a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:11.424623+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 16236544 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:12.424796+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 142 ms_handle_reset con 0x559224eeb000 session 0x559227226d20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:13.424997+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967505 data_alloc: 218103808 data_used: 45056
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:14.425136+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:15.425336+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6d000/0x0/0x4ffc00000, data 0x10f3bc6/0x11ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:16.425553+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:17.425842+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:18.426486+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:19.426917+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:20.427312+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:21.427599+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:22.427787+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:23.428378+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:24.428906+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:25.429343+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:26.429828+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:27.430251+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226b04c00 session 0x559226feeb40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:28.430360+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:29.430512+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:30.430677+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:31.430796+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:32.430988+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:33.431361+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:34.431850+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:35.432102+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:36.432347+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:37.432689+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:38.433058+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226636c00 session 0x55922721e780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:39.433406+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:40.433653+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:41.433810+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:42.434150+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.924980164s of 34.470951080s, submitted: 51
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:43.434459+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971447 data_alloc: 218103808 data_used: 45056
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:44.434763+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226728800 session 0x55922677e960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:45.434866+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:46.435047+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:47.435246+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:48.435454+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971447 data_alloc: 218103808 data_used: 45056
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af8400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226af8400 session 0x559227227e00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e8400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x5592261e8400 session 0x55922723c5a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226634000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226634000 session 0x55922723c780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:49.435594+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 16171008 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:50.435789+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afcc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 16171008 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226afcc00 session 0x55922723c960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afcc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226afcc00 session 0x55922723cd20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:51.436177+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 92913664 unmapped: 1425408 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af6400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226af6400 session 0x55922723cf00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:52.436358+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 92938240 unmapped: 1400832 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.170117378s of 10.184672356s, submitted: 3
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fba67000/0x0/0x4ffc00000, data 0x10f7c84/0x11b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,7])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1000 session 0x55922723d0e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:53.436587+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93839360 unmapped: 17342464 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123797 data_alloc: 234881024 data_used: 13676544
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1800 session 0x55922723da40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:54.436766+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93863936 unmapped: 17317888 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:55.436938+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93855744 unmapped: 17326080 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e7c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e7c00 session 0x559226f24f00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:56.437185+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93937664 unmapped: 17244160 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:57.437344+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93937664 unmapped: 17244160 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1000 session 0x559226f24960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fac33000/0x0/0x4ffc00000, data 0x1f29dc4/0x1fe7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1800 session 0x55922657f860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:58.437552+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93700096 unmapped: 17481728 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1128704 data_alloc: 234881024 data_used: 13676544
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af6400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afcc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:59.437817+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93724672 unmapped: 17457152 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:00.437947+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 99999744 unmapped: 11182080 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:01.438147+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105676800 unmapped: 5505024 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:02.438561+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105676800 unmapped: 5505024 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.482107162s of 10.092863083s, submitted: 62
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:03.438798+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1227418 data_alloc: 234881024 data_used: 25862144
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:04.439046+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:05.439224+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:06.439499+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:07.439699+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:08.439939+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225987 data_alloc: 234881024 data_used: 25862144
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:09.440121+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:10.440312+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:11.440488+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109395968 unmapped: 3883008 heap: 113278976 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa78e000/0x0/0x4ffc00000, data 0x23ceda5/0x248e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:12.440668+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108773376 unmapped: 9756672 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.997505188s of 10.222607613s, submitted: 78
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9eb8000/0x0/0x4ffc00000, data 0x2ca4da5/0x2d64000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:13.440848+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111665152 unmapped: 6864896 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1356813 data_alloc: 251658240 data_used: 27123712
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:14.441064+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 6561792 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:15.441286+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c6c000/0x0/0x4ffc00000, data 0x2d4fda5/0x2e0f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:16.441596+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:17.441783+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:18.442026+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1357269 data_alloc: 251658240 data_used: 27136000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:19.442173+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 6373376 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c49000/0x0/0x4ffc00000, data 0x2d73da5/0x2e33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:20.442397+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 6373376 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:21.442629+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112173056 unmapped: 6356992 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:22.442851+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112173056 unmapped: 6356992 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:23.443081+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c49000/0x0/0x4ffc00000, data 0x2d73da5/0x2e33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354717 data_alloc: 251658240 data_used: 27205632
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:24.443284+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:25.443498+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:26.443772+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.461258888s of 13.723365784s, submitted: 31
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:27.444036+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112353280 unmapped: 6176768 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2d7cda5/0x2e3c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:28.444287+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:29.444528+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:30.444832+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:31.445061+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:32.445277+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:33.445488+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:34.445696+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:35.445840+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:36.446144+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:37.446452+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:38.446667+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ec800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x55922723d4a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af7c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af7c00 session 0x55922723cb40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e6000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e6000 session 0x55922723dc20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:39.446896+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1000 session 0x5592254712c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114376704 unmapped: 4153344 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1800 session 0x559225471a40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:40.447079+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114376704 unmapped: 4153344 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ec800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x559225624f00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af7c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.490158081s of 14.116064072s, submitted: 3
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:41.447267+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af7c00 session 0x559223b87e00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115531776 unmapped: 13500416 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:42.447492+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115531776 unmapped: 13500416 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:43.447755+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b6000/0x0/0x4ffc00000, data 0x3506da5/0x35c6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411121 data_alloc: 251658240 data_used: 29302784
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:44.447934+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:45.448126+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:46.448352+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:47.448580+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:48.448812+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411257 data_alloc: 251658240 data_used: 29302784
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:49.448970+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e2000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e2000 session 0x55922546f0e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:50.449136+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:51.449304+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:52.449497+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:53.449709+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411409 data_alloc: 251658240 data_used: 29306880
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:54.449901+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa400 session 0x5592247d8960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:55.450111+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:56.450323+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922721fa40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f3400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.360092163s of 15.706788063s, submitted: 14
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f3400 session 0x55922721fc20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afd400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:57.450460+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 13418496 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549ec00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:58.450609+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120020992 unmapped: 9011200 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1456479 data_alloc: 251658240 data_used: 33521664
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:59.450757+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 8978432 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b3000/0x0/0x4ffc00000, data 0x3507db5/0x35c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:00.450900+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 8945664 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:01.451038+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 8945664 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b3000/0x0/0x4ffc00000, data 0x3507db5/0x35c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:02.451192+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:03.451343+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1456615 data_alloc: 251658240 data_used: 33521664
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:04.451552+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:05.451714+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b1000/0x0/0x4ffc00000, data 0x350adb5/0x35cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:06.451940+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:07.452095+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:08.452251+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b1000/0x0/0x4ffc00000, data 0x350adb5/0x35cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 9330688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.960206985s of 12.248162270s, submitted: 5
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1457327 data_alloc: 251658240 data_used: 33529856
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:09.452384+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119889920 unmapped: 9142272 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:10.452545+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120283136 unmapped: 8749056 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:11.452894+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 7888896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:12.453040+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 7888896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f2000/0x0/0x4ffc00000, data 0x38bbdb5/0x397c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:13.453209+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1500921 data_alloc: 251658240 data_used: 33931264
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:14.453412+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:15.455497+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:16.455711+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:17.455901+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:18.456081+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549ec00 session 0x55922669fe00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afd400 session 0x55922721e960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1495881 data_alloc: 251658240 data_used: 33931264
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.955549240s of 10.352662086s, submitted: 63
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:19.456260+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922723c960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:20.456427+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:21.456589+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:22.456768+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:23.456942+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c2f000/0x0/0x4ffc00000, data 0x2d8dda5/0x2e4d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356198 data_alloc: 234881024 data_used: 25632768
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:24.457189+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:25.457535+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:26.457896+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:27.458093+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c2f000/0x0/0x4ffc00000, data 0x2d8dda5/0x2e4d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:28.458387+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356198 data_alloc: 234881024 data_used: 25632768
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af6400 session 0x5592267a63c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.853686333s of 10.125116348s, submitted: 14
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afcc00 session 0x5592272292c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:29.458971+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 20922368 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2800 session 0x55922723d4a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:30.459539+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:31.460010+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:32.460178+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08000 session 0x559227227a40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:33.460359+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:34.460581+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:35.460773+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:36.461009+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:37.461175+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:38.461342+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:39.461936+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:40.462161+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:41.462392+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:42.462553+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:43.462784+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:44.462978+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:45.463152+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:46.463355+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.126308441s of 17.207635880s, submitted: 34
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:47.463506+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:48.463759+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1054257 data_alloc: 234881024 data_used: 15777792
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:49.463919+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261eb400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:50.464087+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:51.464237+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:52.464448+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109297664 unmapped: 19734528 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:53.464611+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e9800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e9800 session 0x559226feed20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676f400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108732416 unmapped: 20299776 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1053666 data_alloc: 234881024 data_used: 14729216
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x559226784d20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:54.464782+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108339200 unmapped: 20692992 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afb000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:55.464966+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c1000/0x0/0x4ffc00000, data 0x10fbdbf/0x11bb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108339200 unmapped: 20692992 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb000 session 0x55922721eb40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b00000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00000 session 0x55922669f0e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:56.465233+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afbc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:57.465429+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afbc00 session 0x5592254eeb40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:58.465621+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e9800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e9800 session 0x55922721fa40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1132656 data_alloc: 234881024 data_used: 14729216
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:59.465818+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676f400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x55922721fc20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afb000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.649352074s of 13.054588318s, submitted: 40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108716032 unmapped: 24518656 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9f02000/0x0/0x4ffc00000, data 0x1abadf8/0x1b7a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:00.465937+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb000 session 0x55922721e960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 24207360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b00000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226634800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:01.466140+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 24207360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:02.466313+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:03.466552+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194396 data_alloc: 234881024 data_used: 20054016
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:04.466819+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:05.467060+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:06.467350+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:07.467569+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:08.467787+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194396 data_alloc: 234881024 data_used: 20054016
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:09.468034+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:10.468331+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 22953984 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:11.468548+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 22953984 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:12.468867+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.911570549s of 12.916566849s, submitted: 2
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb400 session 0x559224554960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114982912 unmapped: 18251776 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:13.468993+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115589120 unmapped: 17645568 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292618 data_alloc: 234881024 data_used: 21278720
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:14.469119+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b6000/0x0/0x4ffc00000, data 0x24fddf8/0x25bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 17416192 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:15.469273+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113991680 unmapped: 19243008 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:16.469501+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113991680 unmapped: 19243008 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:17.469655+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:18.469824+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301298 data_alloc: 234881024 data_used: 21491712
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:19.469983+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:20.470141+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:21.470360+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:22.470562+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:23.470758+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301618 data_alloc: 234881024 data_used: 21499904
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:24.470950+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:25.471124+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:26.471372+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.075481415s of 14.293769836s, submitted: 87
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:27.471521+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:28.471704+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301498 data_alloc: 234881024 data_used: 21504000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:29.471883+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:30.472058+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:31.472285+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:32.472623+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:33.472813+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300148 data_alloc: 234881024 data_used: 21504000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:34.472935+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:35.473184+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:36.473458+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:37.473688+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:38.473934+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301820 data_alloc: 234881024 data_used: 21557248
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:39.474198+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:40.474377+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.004294395s of 14.015699387s, submitted: 3
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00000 session 0x55922721ef00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:41.474542+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634800 session 0x5592267a63c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 19996672 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:42.474703+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 19996672 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:43.474880+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa16e000/0x0/0x4ffc00000, data 0x112cdf8/0x11ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1076098 data_alloc: 234881024 data_used: 10645504
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:44.475062+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x55922723c780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:45.475236+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:46.475464+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:47.475664+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:48.475862+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:49.476063+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:50.476253+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:51.476446+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:52.476620+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:53.476840+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:54.477044+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:55.477187+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:56.477372+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:57.477569+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:58.477823+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:59.478019+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:00.478237+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:01.478442+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:02.478590+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:03.478794+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:04.478956+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:05.479109+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:06.479329+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:07.479505+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:08.479712+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:09.479943+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:10.480106+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa800 session 0x559226ffa960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676ec00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676ec00 session 0x559226ffa780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af6400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af6400 session 0x559226ffab40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226ffb680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226634800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.319431305s of 30.379514694s, submitted: 28
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:11.480248+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 26157056 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:12.480390+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634800 session 0x559226ffa5a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e000 session 0x55922721f4a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b06c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06c00 session 0x55922721e5a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e4400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e4400 session 0x55922721e960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226ffb860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:13.480548+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:14.480691+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117062 data_alloc: 234881024 data_used: 10539008
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3ce000/0x0/0x4ffc00000, data 0x15ede08/0x16ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:15.480935+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3ce000/0x0/0x4ffc00000, data 0x15ede08/0x16ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:16.481092+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b01c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b01c00 session 0x5592247d9860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:17.481258+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592267850e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:18.481437+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afc400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc400 session 0x559226784780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b06400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 26206208 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:19.481583+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117924 data_alloc: 234881024 data_used: 10539008
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06400 session 0x559226784960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:20.481760+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:21.481925+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:22.482079+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:23.482200+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:24.482329+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152408 data_alloc: 234881024 data_used: 15626240
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:25.482517+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:26.482820+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:27.482969+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:28.483135+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:29.483275+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152408 data_alloc: 234881024 data_used: 15626240
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:30.483449+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.764934540s of 19.582212448s, submitted: 45
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:31.483645+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:32.483821+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:33.484008+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:34.484175+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152956 data_alloc: 234881024 data_used: 15638528
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:35.484364+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_commit, latency = 5.380156040s
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_sync, latency = 5.380156517s
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.380500793s, txc = 0x559226356c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114221056 unmapped: 19013632 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:36.484567+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113123328 unmapped: 20111360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:37.484763+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f963c000/0x0/0x4ffc00000, data 0x237ee18/0x2440000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,11])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 19062784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:38.484924+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114221056 unmapped: 19013632 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:39.485087+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f963c000/0x0/0x4ffc00000, data 0x237ee18/0x2440000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253386 data_alloc: 234881024 data_used: 16334848
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:40.485221+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:41.485369+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f960a000/0x0/0x4ffc00000, data 0x23b0e18/0x2472000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:42.486002+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:43.486254+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f960a000/0x0/0x4ffc00000, data 0x23b0e18/0x2472000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:44.486472+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1263652 data_alloc: 234881024 data_used: 16482304
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:45.486647+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.865746021s of 15.085161209s, submitted: 113
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:46.486873+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:47.487020+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:48.487306+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9607000/0x0/0x4ffc00000, data 0x23b3e18/0x2475000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112648192 unmapped: 20586496 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:49.570995+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261276 data_alloc: 234881024 data_used: 16486400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:50.571150+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:51.571353+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:52.571638+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:53.571911+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:54.572101+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261500 data_alloc: 234881024 data_used: 16486400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:55.572334+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112664576 unmapped: 20570112 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afc000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc000 session 0x559224ef1a40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af9800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9800 session 0x559224ef03c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x55922721f680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922721f4a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af9c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:56.572604+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112664576 unmapped: 20570112 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.364326477s of 10.769536018s, submitted: 4
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9605000/0x0/0x4ffc00000, data 0x23b4e28/0x2477000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:57.572758+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112672768 unmapped: 20561920 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:58.572944+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112672768 unmapped: 20561920 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9c00 session 0x559227226960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f2400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x559226ffa1e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af9800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9800 session 0x55922723d680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afc000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:59.573079+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 23486464 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1329277 data_alloc: 234881024 data_used: 16490496
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc000 session 0x5592265523c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x55922669e1e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:00.573198+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922483b000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922483b000 session 0x559226785c20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:01.573343+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e38000/0x0/0x4ffc00000, data 0x2b81e28/0x2c44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afdc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afdc00 session 0x5592247d8d20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:02.573620+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592270fc5a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x559226fef680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e38000/0x0/0x4ffc00000, data 0x2b81e28/0x2c44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e3800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:03.573778+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b03800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112599040 unmapped: 23789568 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:04.573918+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113762304 unmapped: 22626304 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352731 data_alloc: 234881024 data_used: 19812352
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:05.574064+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:06.574266+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:07.574446+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:08.574622+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e37000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:09.574833+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119627776 unmapped: 16760832 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383891 data_alloc: 234881024 data_used: 24457216
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:10.575012+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119627776 unmapped: 16760832 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e37000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.307135582s of 14.417451859s, submitted: 28
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:11.575130+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:12.575364+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:13.575510+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:14.575673+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1384819 data_alloc: 234881024 data_used: 24469504
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:15.575807+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e36000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121552896 unmapped: 14835712 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:16.576026+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121905152 unmapped: 14483456 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:17.576190+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123551744 unmapped: 12836864 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:18.576361+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123551744 unmapped: 12836864 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8723000/0x0/0x4ffc00000, data 0x3295e38/0x3359000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:19.576601+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442267 data_alloc: 234881024 data_used: 24694784
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:20.576839+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:21.576980+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.862577438s of 11.139899254s, submitted: 73
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:22.577111+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123625472 unmapped: 12763136 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:23.577286+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123658240 unmapped: 12730368 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:24.577437+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123658240 unmapped: 12730368 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440939 data_alloc: 234881024 data_used: 24694784
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e3800 session 0x5592272270e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:25.577565+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b03800 session 0x559226ffab40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 12722176 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,4])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592255661e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:26.577864+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:27.578002+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:28.578113+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:29.578239+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273528 data_alloc: 234881024 data_used: 16486400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226f252c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592247d94a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b01400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:30.578383+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:31.578568+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b01400 session 0x5592263fe000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:32.578771+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:33.578949+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:34.579163+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097099 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:35.579309+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:36.579460+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:37.579623+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:38.579816+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.934545517s of 17.322147369s, submitted: 73
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:39.579982+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098611 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:40.580134+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:41.580629+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:42.580908+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:43.581625+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:44.581955+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098611 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:45.582094+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:46.582349+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:47.582864+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:48.583006+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:49.583323+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:50.583495+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:51.583752+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:52.583961+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:53.584132+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:54.584286+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:55.584421+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:56.584566+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:57.584903+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:58.585064+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:59.585249+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:00.585395+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:01.585520+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:02.585650+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eeb000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x559225566960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x55922657e960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x55922657f4a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eeb000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x55922546e960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:03.585789+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.590827942s of 24.071311951s, submitted: 2
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:04.585972+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099528 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 23085056 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:05.586138+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9e3a000/0x0/0x4ffc00000, data 0x1b82da6/0x1c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,6,11])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 22855680 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9e3a000/0x0/0x4ffc00000, data 0x1b82da6/0x1c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,17])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:06.586318+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 22855680 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:07.586481+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 29122560 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x55922546ef00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1800 session 0x5592267a72c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x5592267a6000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592263ffe00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eeb000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x5592263feb40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:08.586697+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:09.586870+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1187056 data_alloc: 234881024 data_used: 10539008
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:10.587007+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592263fef00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:11.587145+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632000 session 0x559224742000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f99a9000/0x0/0x4ffc00000, data 0x1c03da6/0x1cc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559223a27c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226f25680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:12.587282+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 33513472 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226f25e00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:13.587429+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afd000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e8400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 33513472 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:14.587560+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1233177 data_alloc: 234881024 data_used: 16625664
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 33120256 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:15.587749+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:16.587901+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:17.588031+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:18.588160+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:19.588282+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268441 data_alloc: 234881024 data_used: 21921792
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:20.588426+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:21.588547+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:22.588688+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1c00 session 0x55922546e5a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:23.588875+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:24.589051+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268897 data_alloc: 234881024 data_used: 21934080
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.883409500s of 21.952882767s, submitted: 22
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:25.589195+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126459904 unmapped: 21479424 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:26.589371+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c8d000/0x0/0x4ffc00000, data 0x2916dc9/0x29d7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,5])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124387328 unmapped: 23552000 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:27.589580+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c6a000/0x0/0x4ffc00000, data 0x2939dc9/0x29fa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124616704 unmapped: 23322624 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:28.589801+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:29.589961+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372079 data_alloc: 234881024 data_used: 22802432
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:30.590091+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:31.590292+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 7795 writes, 32K keys, 7795 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 7795 writes, 1759 syncs, 4.43 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1988 writes, 7632 keys, 1988 commit groups, 1.0 writes per commit group, ingest: 8.26 MB, 0.01 MB/s
                                           Interval WAL: 1988 writes, 772 syncs, 2.58 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:32.590469+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:33.590608+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:34.590818+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372095 data_alloc: 234881024 data_used: 22802432
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:35.590957+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:36.591171+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:37.591325+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:38.591482+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:39.591634+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372095 data_alloc: 234881024 data_used: 22802432
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:40.591831+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b05400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:41.591963+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:42.592113+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:43.592309+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:44.592510+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372247 data_alloc: 234881024 data_used: 22806528
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:45.592657+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:46.592841+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.538951874s of 21.155471802s, submitted: 103
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:47.593101+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:48.593284+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:49.593477+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370448 data_alloc: 234881024 data_used: 22806528
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:50.593596+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:51.593786+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:52.593958+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:53.594119+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,4,0,6])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 15261696 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1400 session 0x559225017c20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:54.594261+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1500262 data_alloc: 234881024 data_used: 22806528
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:55.594397+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:56.594586+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79ef000/0x0/0x4ffc00000, data 0x3bbcdc9/0x3c7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08c00 session 0x559225473c20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:57.594795+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79ef000/0x0/0x4ffc00000, data 0x3bbcdc9/0x3c7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02800 session 0x55922721ef00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:58.594970+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.648444176s of 12.136721611s, submitted: 17
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b03000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b03000 session 0x559226552960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x55922546fc20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122544128 unmapped: 33792000 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:59.595115+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b09000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248ef000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1502946 data_alloc: 234881024 data_used: 22810624
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122544128 unmapped: 33792000 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:00.595263+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135053312 unmapped: 21282816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549f800 session 0x559225470b40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b06000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:01.595383+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:02.595584+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:03.595761+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:04.595929+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1625513 data_alloc: 251658240 data_used: 41050112
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139075584 unmapped: 17260544 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:05.596156+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139149312 unmapped: 17186816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:06.596445+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139288576 unmapped: 17047552 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:07.596709+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:08.596947+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:09.597132+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1625426 data_alloc: 251658240 data_used: 41050112
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:10.597299+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139460608 unmapped: 16875520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.569581985s of 12.389707565s, submitted: 232
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:11.597497+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142147584 unmapped: 14188544 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:12.597655+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7566000/0x0/0x4ffc00000, data 0x4044dc9/0x4105000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:13.597798+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:14.597927+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:15.598096+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b05400 session 0x5592250174a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:16.598337+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:17.598523+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:18.598710+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142516224 unmapped: 13819904 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:19.604273+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142516224 unmapped: 13819904 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:20.604471+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:21.604609+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:22.604766+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:23.605050+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:24.605230+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:25.605425+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b09000 session 0x55922669e780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248ef000 session 0x559225566960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:26.605599+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.538358688s of 15.629971504s, submitted: 40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x5592255ff0e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:27.605844+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:28.606031+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:29.606228+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1381072 data_alloc: 234881024 data_used: 22806528
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:30.606398+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:31.606559+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:32.606821+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:33.607076+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:34.607445+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382584 data_alloc: 234881024 data_used: 22806528
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:35.607654+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:36.607920+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:37.608174+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:38.608470+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:39.608602+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382584 data_alloc: 234881024 data_used: 22806528
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:40.608860+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afd000 session 0x559226f24960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e8400 session 0x55922669e3c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226636c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.595676422s of 14.635678291s, submitted: 13
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:41.609055+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226636c00 session 0x559226553e00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:42.609220+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:43.609430+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:44.609627+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:45.609774+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:46.609998+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:47.610142+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:48.610615+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:49.610940+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:50.611062+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:51.611354+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:52.611698+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:53.612034+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:54.612185+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:55.612346+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:56.612654+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:57.612975+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:58.613123+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02400 session 0x55922723d860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:59.613368+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:00.613628+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:02.109471+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:03.109819+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:04.110070+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:05.110265+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:06.110462+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:07.110850+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:08.111101+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:09.111375+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:10.111622+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:11.111910+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:12.112149+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ecc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x5592267a63c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f3400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f3400 session 0x55922721e5a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ec800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x55922721fa40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226728800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226728800 session 0x559225016780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261eb000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.576278687s of 31.081003189s, submitted: 24
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb000 session 0x559226ffb860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592263fe1e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559226f25680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261edc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261edc00 session 0x559226f250e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4400 session 0x559226f25860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:13.112349+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 37027840 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:14.112670+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 37027840 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:15.112843+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559225471860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194678 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119324672 unmapped: 37011456 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261eb000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb000 session 0x559225472b40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:16.113047+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261edc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261edc00 session 0x559226f252c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa800 session 0x559226f241e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 36855808 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c40000/0x0/0x4ffc00000, data 0x196cdf8/0x1a2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e1000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:17.113271+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 36839424 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:18.113395+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 36839424 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:19.113569+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c1b000/0x0/0x4ffc00000, data 0x1990e08/0x1a51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:20.113699+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260962 data_alloc: 234881024 data_used: 19304448
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:21.114824+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:22.115267+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 36528128 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.900426865s of 10.097883224s, submitted: 55
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1000 session 0x559227226b40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x559224742b40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:23.115824+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x55922669e780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c1b000/0x0/0x4ffc00000, data 0x1990e08/0x1a51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:24.116330+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:25.116811+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:26.117053+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:27.117505+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:28.117751+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:29.118212+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:30.118621+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:31.118955+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:32.119200+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:33.119405+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:34.119772+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:35.120131+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:36.120273+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x559226785a40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592250174a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559224eea400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eea400 session 0x559225017680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559225016000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.801321030s of 13.943515778s, submitted: 49
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:37.120431+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,15])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559225017c20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x5592254705a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x559225473e00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592248f1c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1c00 session 0x55922721eb40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x55922721f0e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:38.120551+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:39.120823+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:40.121029+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f937a000/0x0/0x4ffc00000, data 0x2232da6/0x22f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264351 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:41.121324+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e4800 session 0x559224ef01e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f937a000/0x0/0x4ffc00000, data 0x2232da6/0x22f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:42.121479+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b02c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x5592255661e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:43.121639+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549f000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676e400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:44.121904+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:45.122056+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303380 data_alloc: 234881024 data_used: 15511552
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 43442176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:46.122276+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125919232 unmapped: 38813696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:47.122543+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125952000 unmapped: 38780928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:48.122779+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.294668198s of 11.932563782s, submitted: 37
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549f000 session 0x55922657fe00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e400 session 0x5592272270e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125952000 unmapped: 38780928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:49.123036+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118267904 unmapped: 46465024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:50.123280+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226552b40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 46399488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: mgrc ms_handle_reset ms_handle_reset con 0x5592249d8000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/4198923246
Jan 23 10:40:58 compute-2 ceph-osd[81231]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/4198923246,v1:192.168.122.100:6801/4198923246]
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: get_auth_request con 0x55922676e400 auth_method 0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: mgrc handle_mgr_configure stats_period=5
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:51.123497+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:52.123680+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:53.123870+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:54.124887+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x559225471a40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:55.125554+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:56.126814+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:57.127419+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:58.128372+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:59.128971+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:00.129267+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:01.129707+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:02.130136+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:03.130478+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:04.130619+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:05.130923+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:06.131269+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:07.131577+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:08.131930+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:09.132190+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:10.132392+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:11.132581+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261f0000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.419353485s of 23.247339249s, submitted: 36
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:12.132719+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:13.132912+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:14.133099+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:15.133231+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:16.133478+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:17.133867+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:18.134252+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:19.134486+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:20.134771+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:21.134887+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:22.135090+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:23.135298+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:24.135561+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:25.135763+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:26.135910+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:27.136089+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226f23000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f23000 session 0x559226784f00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559227766000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559227766000 session 0x559226785c20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b00800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00800 session 0x5592254725a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592254730e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.210437775s of 15.619210243s, submitted: 1
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592263ff2c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:28.136264+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:29.136405+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:30.136582+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220376 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afb400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb400 session 0x5592270fd860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:31.136716+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:32.136953+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x5592270fc960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d7400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d7400 session 0x559225567c20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:33.137153+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118620160 unmapped: 46112768 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:34.137397+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118620160 unmapped: 46112768 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226ffbc20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:35.137631+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221149 data_alloc: 234881024 data_used: 10539008
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:36.137802+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:37.137987+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:38.138161+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:39.138323+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:40.138478+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279061 data_alloc: 234881024 data_used: 19120128
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:41.138676+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:42.138818+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:43.138959+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:44.139159+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:45.139335+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279061 data_alloc: 234881024 data_used: 19120128
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:46.139474+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:47.139695+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:48.139835+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.080177307s of 20.722246170s, submitted: 23
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 43720704 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9265000/0x0/0x4ffc00000, data 0x2340d96/0x23ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:49.140030+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126017536 unmapped: 38715392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:50.140186+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126550016 unmapped: 38182912 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1390845 data_alloc: 234881024 data_used: 19333120
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:51.140268+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126550016 unmapped: 38182912 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec0000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:52.140377+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126566400 unmapped: 38166528 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:53.140574+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 39575552 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec9000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:54.140712+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:55.140931+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec9000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382581 data_alloc: 234881024 data_used: 19349504
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:56.141098+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:57.141272+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:58.142536+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:59.143626+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:00.144008+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1384645 data_alloc: 234881024 data_used: 19349504
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:01.144193+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ea1000/0x0/0x4ffc00000, data 0x270cd96/0x27cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:02.144968+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592270fc000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.293542862s of 14.293901443s, submitted: 116
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226f250e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x55922677f2c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:03.145467+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:04.146104+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:05.146389+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:06.146550+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:07.146818+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:08.147174+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:09.147492+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:10.147965+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:11.148196+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:12.148489+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:13.148668+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:14.148994+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:15.149136+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:16.149360+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:17.149709+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:18.150006+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:19.150278+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:20.150521+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:21.150765+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:22.151040+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:23.151268+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:24.151466+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d5800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d5800 session 0x5592272292c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592247d4800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226fee3c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592254efe00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226785c20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.185562134s of 22.248020172s, submitted: 32
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x5592254725a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261e6000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e6000 session 0x55922723c960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x55922657e1e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x5592270fde00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x559225473680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:25.151719+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:26.151970+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209420 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:27.152239+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:28.152368+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226e3c800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3c800 session 0x559225017a40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa07c000/0x0/0x4ffc00000, data 0x1530da6/0x15f0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225656800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225656800 session 0x55922669f2c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:29.152528+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x559226f24780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226ffa3c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:30.152662+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b08800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226e3c800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 45318144 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:31.152819+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216924 data_alloc: 234881024 data_used: 10543104
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 45318144 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:32.153095+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:33.153270+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:34.153439+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa056000/0x0/0x4ffc00000, data 0x1554dd9/0x1616000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:35.153582+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:36.153745+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1246108 data_alloc: 234881024 data_used: 14667776
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:37.153931+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:38.154155+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa056000/0x0/0x4ffc00000, data 0x1554dd9/0x1616000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:39.154341+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:40.154558+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:41.154708+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1246108 data_alloc: 234881024 data_used: 14667776
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:42.155041+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.542800903s of 17.671800613s, submitted: 41
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122945536 unmapped: 41787392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:43.155296+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124780544 unmapped: 39952384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9869000/0x0/0x4ffc00000, data 0x1d41dd9/0x1e03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:44.155453+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125403136 unmapped: 39329792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:45.155633+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f97dc000/0x0/0x4ffc00000, data 0x1dc8dd9/0x1e8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226864c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226864c00 session 0x55922677f860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226863400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226863400 session 0x55922721f0e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922723b000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922723b000 session 0x559226aeb680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922723b000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922723b000 session 0x559225016000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124108800 unmapped: 40624128 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549dc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x55922669e1e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226632400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x55922657f4a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226863400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226863400 session 0x559224649680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226864c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226864c00 session 0x559224ef0f00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225656c00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225656c00 session 0x55922723cf00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:46.155808+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386172 data_alloc: 234881024 data_used: 16515072
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 39067648 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:47.156059+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 39067648 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:48.156222+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9129000/0x0/0x4ffc00000, data 0x2477e4b/0x253b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:49.156369+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:50.156514+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226634400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634400 session 0x55922723da40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:51.156643+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1379724 data_alloc: 234881024 data_used: 16515072
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922549c400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549c400 session 0x5592267852c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226af4400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x559225625680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:52.156795+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ecc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x5592270fd860
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:53.156926+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ecc00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.054047585s of 10.986348152s, submitted: 167
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 37986304 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:54.157067+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 37978112 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:55.157219+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128802816 unmapped: 35930112 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:56.157375+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419509 data_alloc: 234881024 data_used: 22065152
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:57.158185+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:58.158337+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:59.158536+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:00.158670+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:01.158816+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419509 data_alloc: 234881024 data_used: 22065152
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:02.158950+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:03.161591+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.671233177s of 10.673833847s, submitted: 1
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:04.162596+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128876544 unmapped: 35856384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910c000/0x0/0x4ffc00000, data 0x249be6e/0x2560000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:05.162782+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 133996544 unmapped: 30736384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:06.163333+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1494901 data_alloc: 234881024 data_used: 23621632
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 133668864 unmapped: 31064064 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:07.163990+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:08.164089+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:09.164584+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:10.164786+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:11.165017+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1505309 data_alloc: 234881024 data_used: 24363008
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:12.165291+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:13.165531+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:14.166028+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:15.166333+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:16.166579+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.098537445s of 12.288821220s, submitted: 94
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1502669 data_alloc: 234881024 data_used: 24371200
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135053312 unmapped: 29679616 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:17.166905+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:18.167068+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d2000/0x0/0x4ffc00000, data 0x2cd4e6e/0x2d99000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d2000/0x0/0x4ffc00000, data 0x2cd4e6e/0x2d99000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:19.167363+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:20.167631+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:21.167870+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1503037 data_alloc: 234881024 data_used: 24436736
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:22.168022+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88cd000/0x0/0x4ffc00000, data 0x2cdae6e/0x2d9f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:23.168245+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:24.168399+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x55922669f2c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226f23400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 29663232 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:25.168548+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135094272 unmapped: 29638656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:26.168780+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348975 data_alloc: 234881024 data_used: 16515072
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.806308746s of 10.239793777s, submitted: 37
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88ce000/0x0/0x4ffc00000, data 0x2cdae5e/0x2d9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,2])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:27.168946+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dfc/0x2009000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:28.169135+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dfc/0x2009000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:29.169339+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f23400 session 0x559226784f00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:30.169550+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:31.169891+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347011 data_alloc: 234881024 data_used: 16498688
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:32.170069+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:33.170185+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:34.170376+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:35.170566+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:36.170775+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347011 data_alloc: 234881024 data_used: 16498688
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:37.171047+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:38.171300+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:39.171772+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:40.172212+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.272031784s of 13.655331612s, submitted: 34
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x55922546f0e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:41.172367+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1346027 data_alloc: 234881024 data_used: 16498688
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3c800 session 0x5592254ef0e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:42.172686+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226865000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:43.172875+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127320064 unmapped: 37412864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:44.173013+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:45.173377+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:46.173608+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1199806 data_alloc: 234881024 data_used: 10649600
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa48d000/0x0/0x4ffc00000, data 0x111fdb9/0x11df000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:47.173914+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226865000 session 0x559223b86960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:48.174123+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:49.174383+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:50.174518+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:51.174920+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:52.175262+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:53.175428+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:54.175700+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:55.176033+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:56.176192+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:57.176552+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:58.176856+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:59.177052+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:00.177252+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:01.177491+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:02.177659+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:03.177809+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:04.177959+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:05.178129+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:06.178295+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:07.178519+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:08.178681+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:09.179091+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:10.179472+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:11.179820+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:12.180247+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:13.180636+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:14.181002+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:15.181311+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:16.181783+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:17.182227+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:18.182435+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:19.182785+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:20.183086+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:21.183381+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:22.183611+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ef000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ef000 session 0x559226aea3c0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225657800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657800 session 0x5592263feb40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225657400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657400 session 0x559226fef0e0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226e3ac00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3ac00 session 0x55922669f680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x55922676f400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 36.472564697s of 42.319786072s, submitted: 56
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:23.183905+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:24.184153+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:25.184272+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127377408 unmapped: 37355520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:26.184481+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x5592247d85a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225657400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657400 session 0x5592263fe780
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559225657800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657800 session 0x55922721fe00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261ef000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156edcf/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1234189 data_alloc: 234881024 data_used: 10539008
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:27.184803+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:28.185019+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ef000 session 0x559224648000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226e3ac00
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3ac00 session 0x559226fee960
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:29.185195+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:30.185385+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226b0a000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:31.185526+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1234133 data_alloc: 234881024 data_used: 10539008
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:32.185784+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:33.185991+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:34.186209+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 2.432877302s of 11.771712303s, submitted: 33
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b0a000 session 0x559225473c20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:35.186421+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261f1400
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226f22000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:36.186598+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1235130 data_alloc: 234881024 data_used: 10539008
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:37.186933+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:38.187185+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:39.187411+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:40.187661+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:41.187813+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265986 data_alloc: 234881024 data_used: 15028224
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:42.187952+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:43.188140+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:44.188348+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:45.188601+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:46.188775+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265986 data_alloc: 234881024 data_used: 15028224
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:47.188965+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:48.189114+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:49.189257+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.148657799s of 15.006252289s, submitted: 6
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:50.189376+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132644864 unmapped: 32088064 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:51.189489+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302734 data_alloc: 234881024 data_used: 15024128
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:52.189639+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b43000/0x0/0x4ffc00000, data 0x1a68e08/0x1b29000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,2,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:53.189838+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130285568 unmapped: 34447360 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:54.190054+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129679360 unmapped: 35053568 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f997a000/0x0/0x4ffc00000, data 0x1c29e08/0x1cea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,4])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:55.190268+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129687552 unmapped: 35045376 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:56.190389+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328778 data_alloc: 234881024 data_used: 15020032
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:57.190598+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96ff000/0x0/0x4ffc00000, data 0x1eace08/0x1f6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130875392 unmapped: 33857536 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:58.190893+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:59.191074+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1.013013244s of 10.206089973s, submitted: 61
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96ff000/0x0/0x4ffc00000, data 0x1eace08/0x1f6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:00.191267+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:01.191438+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1333106 data_alloc: 234881024 data_used: 15360000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:02.191645+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:03.191826+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:04.191977+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:05.192135+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96e2000/0x0/0x4ffc00000, data 0x1ec9e08/0x1f8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:06.192282+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348124 data_alloc: 234881024 data_used: 15777792
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:07.192507+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:08.192670+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:09.192863+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:10.193002+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.314796448s of 11.320782661s, submitted: 31
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261f1400 session 0x55922669fc20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f22000 session 0x559226fef4a0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x559226afa000
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:11.193125+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129081344 unmapped: 35651584 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96e2000/0x0/0x4ffc00000, data 0x1ec9e08/0x1f8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207674 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:12.193544+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129089536 unmapped: 35643392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa000 session 0x559226aeb680
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:13.194324+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:14.194848+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:15.195121+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:16.195434+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:17.195659+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:18.196075+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:19.196424+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:20.197008+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:21.197315+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:22.197634+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:23.197937+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:24.198105+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:25.198240+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:26.198433+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:27.198656+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:28.198902+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:29.199052+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:30.199199+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:31.199362+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:32.199582+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:33.199816+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:34.200022+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:35.200192+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:36.200361+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:37.200650+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:38.200904+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:39.201087+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:40.201312+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:41.201544+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:42.201715+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:43.202100+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:44.202294+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:45.202483+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:46.202681+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:47.202928+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:48.203124+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:49.203322+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:50.203538+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:51.203708+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:52.203888+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:53.204029+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:54.204200+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:55.204411+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:56.204560+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:57.204743+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:58.204910+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:59.205046+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:00.205210+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:01.205379+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:02.205561+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:03.205821+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:04.205950+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:05.206109+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:06.206297+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:07.206536+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:08.206747+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:09.206921+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:10.207075+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:11.207194+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:12.207339+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:13.207517+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:14.207808+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:15.208018+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:16.208227+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:17.208477+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:18.208667+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:19.208835+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:20.209137+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:21.209317+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:22.209500+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:23.209650+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:24.210296+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:25.210594+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:26.210895+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:27.211769+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129179648 unmapped: 35553280 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:28.211976+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129179648 unmapped: 35553280 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:29.212174+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:30.212416+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:31.212627+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:32.212841+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:33.213062+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:34.213278+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:35.213561+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:36.213916+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:37.214194+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:38.214433+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:39.214653+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:40.214831+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:41.215011+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:42.215136+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129228800 unmapped: 35504128 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:43.280094+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'config show' '{prefix=config show}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:44.280489+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128770048 unmapped: 35962880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:45.280717+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128917504 unmapped: 35815424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'log dump' '{prefix=log dump}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:46.281047+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'perf dump' '{prefix=perf dump}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'perf schema' '{prefix=perf schema}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128909312 unmapped: 35823616 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:47.281303+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:48.281539+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:49.281675+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:50.281802+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:51.281928+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:52.282325+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:53.282538+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:54.282949+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:55.283140+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:56.283324+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:57.283522+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:58.283681+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:59.283872+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:00.284065+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:01.284196+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:02.284552+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:03.284806+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:04.284920+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:05.285075+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:06.285202+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:07.285364+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:08.285504+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:09.285635+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:10.285811+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:11.285949+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:12.286114+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:13.286239+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:14.286480+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:15.286624+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:16.286815+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:17.287063+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:18.287188+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:19.287321+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:20.287451+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:21.287591+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:22.287774+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:23.288006+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:24.288237+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:25.288462+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:26.288829+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:27.289189+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:28.289351+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:29.289535+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:30.289806+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:31.290017+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:32.290226+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:33.290419+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:34.290663+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:35.290853+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:36.291006+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:37.291192+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:38.291331+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:39.291491+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:40.291806+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:41.291980+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:42.292170+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:43.292345+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:44.292537+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:45.292752+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:46.292930+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:47.293159+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:48.293316+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:49.293481+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:50.293652+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:51.293867+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:52.294051+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:53.294242+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:54.295909+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:55.296194+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:56.296443+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:57.296687+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:58.297411+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:59.297780+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:00.297983+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:01.298289+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:02.298469+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:03.298844+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:04.299142+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:05.299379+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:06.300308+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:07.300542+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:08.300707+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:09.300905+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:10.301281+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:11.301445+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:12.301599+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:13.301767+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:14.301915+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:15.302102+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:16.302250+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:17.302436+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:18.302825+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:19.303076+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:20.303271+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:21.303442+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:22.303681+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:23.303847+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:24.303996+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:25.304135+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:26.304408+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:27.304889+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:28.305105+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:29.305311+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:30.305487+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:31.305679+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 9973 writes, 39K keys, 9973 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 9973 writes, 2660 syncs, 3.75 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2178 writes, 7712 keys, 2178 commit groups, 1.0 writes per commit group, ingest: 7.91 MB, 0.01 MB/s
                                           Interval WAL: 2178 writes, 901 syncs, 2.42 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:32.305991+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:33.306304+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:34.306493+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:35.306670+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:36.306818+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:37.307102+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:38.307392+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:39.307581+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:40.307832+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:41.308083+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:42.308276+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:43.308448+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:44.308644+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:45.308824+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:46.308982+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:47.309137+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:48.309358+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:49.309561+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:50.309707+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:51.309916+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:52.310058+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:53.310216+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:54.310448+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:55.310569+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:56.310814+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:57.310995+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:58.311139+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:59.311288+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:00.311487+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:01.311667+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:02.311811+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:03.311957+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129032192 unmapped: 35700736 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:04.312087+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129032192 unmapped: 35700736 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 232.847518921s of 234.084274292s, submitted: 40
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:05.312252+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129032192 unmapped: 35700736 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:06.312378+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129056768 unmapped: 35676160 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:07.312566+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129064960 unmapped: 35667968 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:08.312709+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129064960 unmapped: 35667968 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:09.312913+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129073152 unmapped: 35659776 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:10.313104+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129073152 unmapped: 35659776 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:11.313215+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 35635200 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:12.313313+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:13.313469+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:14.313609+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129253376 unmapped: 35479552 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 3.775661469s of 10.127549171s, submitted: 153
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:15.313805+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129286144 unmapped: 35446784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,2])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:16.313995+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129294336 unmapped: 35438592 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:17.314179+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 35381248 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:18.314316+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 35381248 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:19.314442+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:20.314569+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:21.314692+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:22.314825+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:23.314968+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:24.315147+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:25.315275+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:26.315403+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:27.315548+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:28.315689+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:29.315879+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:30.316034+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:31.316169+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:32.316315+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:33.316438+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:34.316590+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:35.316711+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:36.316867+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:37.317043+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:38.317132+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:39.317264+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:40.317435+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:41.317607+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:42.317803+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:43.317923+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:44.318044+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:45.318155+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:46.318293+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:47.318672+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:48.318977+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:49.319148+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:50.319277+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:51.319436+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:52.319599+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:53.319743+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:54.319885+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:55.320026+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:56.320165+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:57.320349+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:58.320475+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:59.320609+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:00.320798+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:01.320968+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:02.321107+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:03.321258+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:04.321397+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:05.321561+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:06.321704+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:07.321904+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:08.322045+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:09.322187+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:10.322327+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:11.322474+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:12.322757+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:13.322886+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:14.323036+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:15.323167+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:16.323305+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:17.323480+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:18.323629+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:19.323803+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:20.323990+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:21.324224+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:22.324376+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:23.324504+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:24.324658+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:25.324845+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:26.325087+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:27.325279+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:28.325460+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:29.325616+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:30.325788+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:31.325958+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:32.326132+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:33.326280+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:34.326427+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:35.326541+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:36.326954+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:37.327402+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:38.327642+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:39.327843+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:40.327987+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:41.328143+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:42.328301+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:43.328442+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:44.328581+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:45.328694+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:46.328829+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:47.328976+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:48.329104+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:49.329238+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:50.329358+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:51.329524+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:52.329694+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:53.329817+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:54.329933+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:55.330075+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:56.330192+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:57.330374+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:58.330538+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:59.330693+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:00.330858+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:01.331017+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:02.331149+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:03.331395+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:04.331571+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:05.331922+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:06.332097+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:07.332314+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:08.332488+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:09.332621+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:10.332798+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:11.332950+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:12.333087+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:13.333227+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:14.333484+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:15.333619+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:16.333761+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:17.333988+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:18.334168+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:19.334387+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:20.334531+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:21.334783+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:22.334924+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:23.335067+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:24.335255+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:25.335412+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:26.335567+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:27.335781+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:28.335969+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:29.336120+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:30.336242+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:31.336373+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:32.336544+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:33.336702+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:34.336904+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:35.337069+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:36.337232+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:37.337397+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:38.337568+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:39.337777+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:40.337930+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:41.338087+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:42.338221+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:43.338343+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:44.338500+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:45.338688+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:46.338859+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:47.339050+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:48.339319+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:49.339664+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:50.339871+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:51.340168+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:52.340337+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:53.340481+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:54.340658+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:55.340908+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:56.341147+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:57.341404+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:58.341589+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:59.341770+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:00.341930+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:01.342071+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:02.342229+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:03.342379+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:04.342509+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:05.342628+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:06.342796+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:07.343016+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:08.343234+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:09.343635+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:10.343906+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:11.344150+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:12.344300+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:13.344472+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:14.344604+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:15.344772+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:16.344911+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:17.345086+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:18.345232+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:19.345375+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:20.345514+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:21.345661+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:22.345801+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:23.345917+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:24.346050+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:25.346195+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:26.346339+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:27.346503+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:28.346642+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:29.346782+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:30.346925+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:31.347061+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:32.347214+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:33.347359+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:34.347496+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:35.347638+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:36.347763+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:37.347935+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:38.348056+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:39.348183+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:40.348383+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:41.348524+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:42.348661+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:43.348858+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:44.349025+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:45.349171+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:46.349308+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:47.349474+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:48.349603+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:49.349754+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:50.349949+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:51.350365+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:52.350681+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:53.350788+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:54.351015+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:55.351290+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:56.351627+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:57.352000+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:58.352161+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:59.352458+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:00.352594+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:01.352868+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:02.353168+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:03.353466+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:04.353769+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:05.353934+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:06.354103+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:07.354321+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:08.354470+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:09.354647+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:10.354797+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:11.354991+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:12.355179+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:13.355402+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:14.355597+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:15.355843+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:16.356001+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:17.356260+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:18.356522+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:19.356708+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:20.356881+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:21.357103+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:22.357330+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:23.357514+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:24.357750+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:25.357917+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:26.358097+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:27.358318+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:28.358532+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:29.358811+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:30.358958+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:31.359132+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets getting new tickets!
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:32.359542+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _finish_auth 0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:32.361025+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:33.359752+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:34.360012+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:35.360196+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:36.360529+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:37.360783+0000)
Jan 23 10:40:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:38.361017+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:39.361269+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:40.361605+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:41.361786+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:42.361935+0000)
Jan 23 10:40:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:58.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:43.362120+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:44.362270+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:45.362391+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:46.362562+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:47.362747+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:48.362909+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:49.363038+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:50.363157+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:51.363415+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:52.363609+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:53.363752+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:54.363893+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:55.364046+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:56.364227+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:57.364434+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:58.364581+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:59.364713+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:00.364882+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:01.365028+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06000 session 0x5592255fed20
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: handle_auth_request added challenge on 0x5592261f0800
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:02.365183+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:03.365322+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:04.365433+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:05.365564+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:06.365689+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:07.365842+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:08.365932+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:09.366058+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:10.366188+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:11.366351+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:12.366564+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:13.366782+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:14.366923+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:15.367152+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:16.367342+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:17.367575+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:18.367753+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:19.367929+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:20.368141+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:21.368446+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:22.368681+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:23.368863+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:24.369008+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:25.369263+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:26.369420+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:27.369628+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:28.369794+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:29.369980+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:30.370229+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:31.370400+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:32.370546+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:33.370674+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:34.370813+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:35.370950+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:36.371082+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:37.371276+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:38.371416+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:39.371584+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:40.371756+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:41.371967+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:42.372137+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:43.372271+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:44.372426+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:45.372562+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:46.372694+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:47.372904+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:48.373096+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:49.373266+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:50.373418+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:51.373586+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:52.373793+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:53.373920+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:54.374050+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:55.374214+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:56.374340+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:57.374503+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:58.374821+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:59.374982+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:00.375150+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:01.375287+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:02.375436+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:03.375560+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:04.375704+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:05.375884+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:06.376037+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:07.376220+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:08.376343+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:09.376482+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:10.376623+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:11.376791+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:12.376979+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:13.377118+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:14.377246+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:15.377381+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:16.377525+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:17.377714+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:18.377892+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:19.378040+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:20.378177+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:21.378322+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:22.378465+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:23.378603+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:24.378764+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:25.378926+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:26.379074+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:27.379259+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:28.379388+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:29.379518+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:30.379643+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:31.379774+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:32.380006+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:33.380143+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:34.380278+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:35.380447+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:36.380642+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:37.380819+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:38.380935+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:39.381058+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:40.381261+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:41.381437+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:42.381578+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:43.381701+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:44.381866+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:45.382047+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:46.382195+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:47.382372+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:48.382509+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:49.382634+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:50.382769+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:51.382961+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:52.383089+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:53.383232+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:54.383357+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:55.383497+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:56.383678+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:57.383907+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:58.384056+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:59.384219+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:00.384353+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:01.384486+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:02.384625+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:03.384779+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:04.384908+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:05.385025+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:06.385156+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:07.385375+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:08.385539+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:09.385690+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:10.385859+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:11.386002+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:12.386141+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:13.386295+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:14.386442+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:15.386578+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:16.386778+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:17.386946+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:18.387118+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:19.387239+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:20.387378+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:21.387510+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:22.387645+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:23.387772+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:24.387918+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:25.388060+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:26.388197+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:27.388439+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:28.388591+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:29.389370+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:30.390453+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:31.390689+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:32.391214+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:33.391501+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:34.391657+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:35.391902+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:36.392183+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:37.392648+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:38.393027+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:39.393256+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:40.393607+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:41.393970+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:42.394218+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:43.394381+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:44.394564+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:45.394814+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:46.394961+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:47.395132+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:48.395271+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:49.395446+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:50.395634+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:51.395826+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:52.396105+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:53.396359+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:54.396502+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:55.396714+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:56.396886+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:57.397075+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:58.397195+0000)
Jan 23 10:40:58 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:58 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:59.397329+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:00.397463+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:01.397663+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:02.397817+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:03.398004+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:04.398176+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:05.398435+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:06.398778+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:07.399071+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:08.399224+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:09.399472+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:10.399779+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:11.400053+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:12.400304+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:13.400466+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:14.400609+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:15.400747+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:16.400964+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:17.401169+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:18.401321+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:19.401468+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:20.401622+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:21.401775+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:22.401934+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:23.402094+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:24.402210+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:40:58 compute-2 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:40:58 compute-2 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:25.402326+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'config show' '{prefix=config show}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:26.402458+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129204224 unmapped: 35528704 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: tick
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_tickets
Jan 23 10:40:58 compute-2 ceph-osd[81231]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:27.402617+0000)
Jan 23 10:40:58 compute-2 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129359872 unmapped: 35373056 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 10:40:58 compute-2 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 10:40:58 compute-2 ceph-osd[81231]: do_command 'log dump' '{prefix=log dump}'
Jan 23 10:40:58 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:40:58 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:40:58 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:58.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:40:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 10:40:58 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3124684249' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.27295 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.17769 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/365728033' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3781180657' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/246354880' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3056118106' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/529095491' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2687217909' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3547435586' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/823562891' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3096053969' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3182871652' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3408741616' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/458587115' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:40:58 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 10:40:58 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/666089249' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:59 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:59 2026: (VI_0) received an invalid passwd!
Jan 23 10:40:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 10:40:59 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1596010367' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: pgmap v1439: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.17787 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.27331 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.17808 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.27346 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3124684249' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1271257117' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.17820 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.27361 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/445151465' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3276688100' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/666089249' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.17841 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1254495411' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.27385 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.27185 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1212026481' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1596010367' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:59 compute-2 ceph-mon[75771]: from='client.17862 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-2 crontab[251938]: (root) LIST (root)
Jan 23 10:40:59 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 10:40:59 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/239682889' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:41:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 10:41:00 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1745865103' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:41:00 compute-2 nova_compute[225701]: 2026-01-23 10:41:00.323 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:00.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:00 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:00 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:00 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:00 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:00 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:00.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.27403 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3477516146' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: pgmap v1440: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.27200 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.17871 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/239682889' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1315781543' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.27418 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2044172642' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.27221 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.17886 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1745865103' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.27436 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2564409329' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/716455899' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 10:41:00 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/875828868' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:41:00 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 10:41:00 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2414639810' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:41:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:41:01 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:01 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.27239 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.17910 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.27457 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/875828868' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2414639810' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/148398286' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.17925 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.27254 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.27469 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.17937 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1085532514' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: from='client.27269 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:01 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 23 10:41:01 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1637687014' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:41:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 23 10:41:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/233774706' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:41:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:02.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:02 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:02 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 23 10:41:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3108936715' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:41:02 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:02 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:02 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:02.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 23 10:41:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1660365616' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:41:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 23 10:41:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3345460299' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:41:02 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 23 10:41:02 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1102334395' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:41:02 compute-2 nova_compute[225701]: 2026-01-23 10:41:02.924 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:03 compute-2 systemd[1]: Starting Hostname Service...
Jan 23 10:41:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 23 10:41:03 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2385619581' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 23 10:41:03 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/369562290' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-2 systemd[1]: Started Hostname Service.
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.27484 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: pgmap v1441: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1637687014' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.17949 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/887272108' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.27278 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.27496 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/712991638' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/233774706' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.17961 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3110235279' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.27287 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3108936715' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1660365616' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/312762499' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/420082444' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:03 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 23 10:41:03 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3665764273' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:41:03 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 23 10:41:03 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4166326524' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 23 10:41:04 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2110475920' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:41:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 23 10:41:04 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3690345591' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:04.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:04 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:04 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:04 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:04 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 10:41:04 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:04.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 10:41:04 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 23 10:41:04 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/924431456' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.27302 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3345460299' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1102334395' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2934036653' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.27314 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2385619581' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/369562290' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3236167764' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2272500806' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1644214128' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/222612964' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3665764273' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/4166326524' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1452223012' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3579800927' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2110475920' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2527938535' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/3690345591' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1233404613' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:41:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:05 compute-2 nova_compute[225701]: 2026-01-23 10:41:05.325 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:05 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:05 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:05 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 23 10:41:05 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/631640421' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: pgmap v1442: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.27326 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3136483563' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3117015144' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/674797361' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/924431456' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4081738896' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2279399780' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3359288182' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2663519361' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1474099025' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/265346978' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2144523971' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4016238504' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3327562445' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:41:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:41:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 23 10:41:06 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1856915728' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:41:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:06.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:06 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:06 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:06 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:06 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:06 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:06.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:06 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 23 10:41:06 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/444520388' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:07 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:07 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:07 compute-2 ceph-mon[75771]: pgmap v1443: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/631640421' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.18081 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/142027931' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1856915728' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.18099 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1078464107' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.18105 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/444520388' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1979006449' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:41:07 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/516926078' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-2 nova_compute[225701]: 2026-01-23 10:41:07.926 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 23 10:41:08 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2866924100' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:41:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:08.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:08 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:08 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:08 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:08 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:41:08 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:08.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:41:08 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 23 10:41:08 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/704825705' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.18120 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.27637 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.27655 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2337187949' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.27416 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.27664 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2907186737' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: pgmap v1444: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.18147 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.27673 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4288764517' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/1493572992' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.18165 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.27428 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.27691 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/1961194654' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2866924100' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/2109945965' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4166585728' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:41:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:09 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:09 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 10:41:09 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2035341214' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:41:09 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:09 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:09 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:09 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:09 compute-2 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 23 10:41:09 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1386926707' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.18177 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.27440 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.27703 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.27446 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/704825705' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.18195 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.27721 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.27461 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/2035341214' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.27739 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/4171598296' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:10 compute-2 ceph-mon[75771]: from='client.? 192.168.122.102:0/1386926707' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-2 nova_compute[225701]: 2026-01-23 10:41:10.327 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:10.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:10 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:10 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:10 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:10 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:10 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:10.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:10 compute-2 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:11 compute-2 sudo[253342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:41:11 compute-2 sudo[253342]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:41:11 compute-2 sudo[253342]: pam_unix(sudo:session): session closed for user root
Jan 23 10:41:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:11 compute-2 sudo[253369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:41:11 compute-2 sudo[253369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:41:11 compute-2 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:41:11 compute-2 ceph-mon[75771]: pgmap v1445: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='client.27473 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='client.27775 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/4243432763' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/3513160830' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='client.27497 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='client.18273 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.101:0/2845937288' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-2 ceph-mon[75771]: from='client.? 192.168.122.100:0/3453822127' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:41:11 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:11 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:11 compute-2 sudo[253369]: pam_unix(sudo:session): session closed for user root
Jan 23 10:41:11 compute-2 sudo[253458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:41:11 compute-2 sudo[253458]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:41:11 compute-2 sudo[253458]: pam_unix(sudo:session): session closed for user root
Jan 23 10:41:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 10:41:12 compute-2 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 10:41:12 compute-2 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:12.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 10:41:12 compute-2 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:12 2026: (VI_0) received an invalid passwd!
Jan 23 10:41:12 compute-2 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
